Oracle PL/SQL Developer
US based Product MNC

3 - 10 yrs
₹12L - ₹20L / yr
Remote, Bengaluru (Bangalore)
Skills
Oracle
Oracle SQL Developer
PL/SQL
Oracle Application Express (APEX)
JavaScript
HTML/CSS
• Minimum of 3 years of work experience.
• Experience working on complex enterprise solutions is a plus.
• Very strong Oracle PL/SQL and SQL skills are required (see the sketch below).
• Strong CSS/jQuery/JavaScript skills are needed.
• Experience with Oracle APEX development is required.
• Experience using Bitbucket/GitHub for version control.
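
For context on the core skill, here is a minimal sketch of invoking a PL/SQL stored procedure from application code with the python-oracledb driver. The connection details and the upsert_customer procedure are hypothetical illustrations, not details from this listing:

```python
# Minimal sketch: call a PL/SQL stored procedure from Python.
# The DSN, credentials and upsert_customer procedure are hypothetical.
import oracledb

conn = oracledb.connect(user="app", password="secret",
                        dsn="dbhost:1521/ORCLPDB1")
with conn.cursor() as cur:
    new_id = cur.var(oracledb.NUMBER)          # OUT parameter
    cur.callproc("upsert_customer", ["ACME Corp", "enterprise", new_id])
    print("customer id:", new_id.getvalue())
conn.commit()
```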

Similar jobs

6sense
Posted by Romesh Rawat
Remote only
9 - 15 yrs
Best in industry
PySpark
Data engineering
Big Data
Hadoop
Spark

About Us:

6sense is a Predictive Intelligence Engine that is reimagining how B2B companies do sales and marketing. It works with big data at scale, advanced machine learning and predictive modelling to find buyers and predict what they will purchase, when and how much.

6sense helps B2B marketing and sales organizations fully understand the complex ABM buyer journey. By combining intent signals from every channel with the industry’s most advanced AI predictive capabilities, it is finally possible to predict account demand and optimize demand generation in an ABM world. Equipped with the power of AI and the 6sense Demand Platform™, marketing and sales professionals can uncover, prioritize, and engage buyers to drive more revenue.

6sense is seeking a Staff Software Engineer (Data) to become part of a team designing, developing, and deploying its customer-centric applications.

We’ve more than doubled our revenue in the past five years and completed our Series E funding of $200M last year, giving us a stable foundation for growth.

Responsibilities:

1. Own critical datasets and data pipelines for product & business, and work towards direct business goals of increased data coverage, data match rates, data quality and data freshness
2. Create more value from various datasets with creative solutions, unlock more value from existing data, and help build a data moat for the company
3. Design, develop, test, deploy and maintain optimal data pipelines, and assemble large, complex data sets that meet functional and non-functional business requirements
4. Improve our current data pipelines, i.e. improve their performance and SLAs, remove redundancies, and figure out a way to test before vs. after rollout
5. Identify, design, and implement process improvements in data flow across multiple stages and via collaboration with multiple cross-functional teams, e.g. automating manual processes, optimising data delivery, hand-off processes, etc.
6. Work with cross-functional stakeholders, including the Product, Data Analytics and Customer Support teams, to enable their data access and related goals
7. Build for security, privacy, scalability, reliability and compliance
8. Mentor and coach other team members on scalable and extensible solution design and best coding standards
9. Help build a team and cultivate innovation by driving cross-collaboration and execution of projects across multiple teams

Requirements:

• 8-10+ years of overall work experience as a Data Engineer
• Excellent analytical and problem-solving skills
• Strong experience with Big Data technologies like Apache Spark; experience with Hadoop, Hive, Presto would be a plus (see the PySpark sketch below)
• Strong experience in writing complex, optimized SQL queries across large data sets; experience with optimizing queries and underlying storage
• Experience with Python/Scala
• Experience with Apache Airflow or other orchestration tools
• Experience with writing Hive/Presto UDFs in Java
• Experience working on the AWS cloud platform and services
• Experience with key-value stores or NoSQL databases would be a plus
• Comfortable with the Unix/Linux command line
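
As a flavour of the pipeline work described above, here is a minimal PySpark sketch that reads two datasets, enriches one with the other and writes the result partitioned by date. The S3 paths, column names and join key are hypothetical placeholders, not details from this posting:

```python
# Minimal PySpark sketch: read, enrich, deduplicate and write a dataset.
# Paths, column names and the "domain" join key are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("enrich_accounts").getOrCreate()

accounts = spark.read.parquet("s3://example-bucket/raw/accounts/")
firmographics = spark.read.parquet("s3://example-bucket/raw/firmographics/")

enriched = (
    accounts.join(firmographics, on="domain", how="left")
            .dropDuplicates(["account_id"])        # one row per account
            .withColumn("ingest_date", F.current_date())
)

enriched.write.mode("overwrite").partitionBy("ingest_date") \
        .parquet("s3://example-bucket/curated/accounts/")
```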

Interpersonal Skills:

• You can work independently as well as part of a team.
• You take ownership of projects and drive them to conclusion.
• You’re a good communicator and are capable of not just doing the work, but also teaching others and explaining the “why” behind complicated technical decisions.
• You aren’t afraid to roll up your sleeves: this role will evolve over time, and we’ll want you to evolve with it.

Innovalus Technologies
Agency job via Innovalus Technologies, posted by Devarani Ganesh
Hyderabad
7 - 10 yrs
₹15L - ₹24L / yr
Oracle
SAP BI
ETL
Java
Informatica
Data Warehouse Engineer (Telecom Domain)

• Hands-on ETL developer who can also lead a team, with 7 years of overall experience.
• Design and build SAP-BI (Business Intelligence) Data Universes and Reports.
• Java, Python, JavaScript, Microservices.
• Experienced ODI (Oracle Data Integrator) developer.
• Linux/UNIX shell scripting.

Location: Hyderabad


Chennai
5 - 14 yrs
₹13L - ₹21L / yr
Python
Java
PySpark
JavaScript
Hadoop

Python + Data Scientist:

• Hands-on and sound knowledge of Python, PySpark and JavaScript
• Build data-driven models to understand the characteristics of engineering systems
• Train, tune, validate, and monitor predictive models (see the sketch below)
• Sound knowledge of statistics
• Experience in developing data processing tasks using PySpark, such as reading, merging, enrichment and loading of data from external systems to target data destinations
• Working knowledge of Big Data and/or Hadoop environments
• Experience creating CI/CD pipelines using Jenkins or similar tools
• Practiced in eXtreme Programming (XP) disciplines
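
To make the train/tune/validate cycle above concrete, here is a minimal scikit-learn sketch on synthetic data; the features, model choice and parameter grid are illustrative assumptions, not requirements from this posting:

```python
# Minimal sketch of a train/tune/validate loop with scikit-learn.
# Synthetic data; the model and parameter grid are illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import GridSearchCV, train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 5))          # stand-in engineering features
y = X @ np.array([1.5, -2.0, 0.5, 0.0, 3.0]) + rng.normal(scale=0.1, size=1000)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

search = GridSearchCV(                   # tune hyperparameters by cross-validation
    RandomForestRegressor(random_state=0),
    param_grid={"n_estimators": [100, 300], "max_depth": [None, 10]},
    cv=5,
)
search.fit(X_train, y_train)             # train
preds = search.predict(X_test)           # validate on held-out data
print("MAE:", mean_absolute_error(y_test, preds))
```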

GradMener Technology Pvt. Ltd.
Pune
2 - 5 yrs
₹3L - ₹15L / yr
ETL
Informatica
Data Warehouse (DWH)
SQL
Oracle
Job Description:

Roles and Responsibilities:
  • Designing and coding the data warehousing system to desired company specifications 
  • Conducting preliminary testing of the warehousing environment before data is extracted
  • Extracting company data and transferring it into the new warehousing environment (a minimal sketch of this step follows this list)
  • Testing the new storage system once all the data has been transferred
  • Troubleshooting any issues that may arise
  • Providing maintenance support
  • Consulting with data management teams to get a big-picture idea of the company’s data storage needs
  • Presenting the company with warehousing options based on their storage needs
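
For illustration only, a bare-bones extract-and-load step of the kind described above might look like this pandas + SQLAlchemy sketch (assuming SQLAlchemy 2.x with the oracledb and psycopg2 drivers); the connection strings, table and column names are hypothetical:

```python
# Bare-bones extract-transform-load sketch with pandas + SQLAlchemy.
# Connection strings, table and column names are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

source = create_engine("oracle+oracledb://reader:secret@src-host:1521/?service_name=SRC")
target = create_engine("postgresql+psycopg2://loader:secret@dwh-host/dwh")

# Extract
orders = pd.read_sql("SELECT order_id, amount, order_ts FROM orders", source)

# Transform: normalise timestamps and drop obviously bad rows
orders["order_ts"] = pd.to_datetime(orders["order_ts"], utc=True)
orders = orders[orders["amount"] >= 0]

# Load into a warehouse staging table
orders.to_sql("stg_orders", target, if_exists="append", index=False)
```
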
Requirements:
  • Experience of 1-3 years in Informatica PowerCenter
  • Excellent knowledge of Oracle databases and PL/SQL, e.g. stored procedures, functions, user-defined functions, table partitioning, indexes, views, etc.
  • Knowledge of SQL Server databases
  • Hands-on experience in Informatica PowerCenter, and in database performance tuning and optimization, including complex query optimization techniques
  • Understanding of ETL control frameworks
  • Experience in UNIX shell/Perl scripting
  • Good communication skills, including the ability to write clearly
  • Able to function effectively as a member of a team
  • Proactive with respect to personal and technical development
Mumbai
5 - 8 yrs
₹25L - ₹30L / yr
SQL Azure
ADF
Azure data factory
Azure Datalake
Azure Databricks
As a hands-on Data Architect, you will be part of a team responsible for building enterprise-grade Data Warehouse and Analytics solutions that aggregate data across diverse sources and data types, including text, video and audio through to live streams and IoT, in an agile project delivery environment with a focus on DataOps and Data Observability. You will work with Azure SQL Databases, Synapse Analytics, Azure Data Factory, Azure Datalake Gen2, Azure Databricks, Azure Machine Learning, Azure Service Bus, Azure Serverless (LogicApps, FunctionApps), Azure Data Catalogue and Purview, among other tools, gaining opportunities to learn some of the most advanced and innovative techniques in the cloud data space.

You will be building Power BI based analytics solutions to provide actionable insights into customer data, and to measure operational efficiencies and other key business performance metrics.

You will be involved in the development, build, deployment, and testing of customer solutions, with responsibility for the design, implementation and documentation of the technical aspects, including integration to ensure the solution meets customer requirements. You will be working closely with fellow architects, engineers, analysts, team leads and project managers to plan, build and roll out data-driven solutions.
Expertise:

• Proven expertise in developing data solutions with Azure SQL Server and Azure SQL Data Warehouse (now Synapse Analytics)
• Demonstrated expertise in data modelling and data warehouse methodologies and best practices
• Ability to write efficient data pipelines for ETL using Azure Data Factory or equivalent tools
• Integration of data feeds utilising both structured (e.g. XML/JSON) and flat schemas (e.g. CSV, TXT, XLSX) across a wide range of electronic delivery mechanisms (API, SFTP, etc.); see the sketch after this list
• Azure DevOps knowledge essential for CI/CD of data ingestion pipelines and integrations
• Experience with object-oriented/object function scripting languages such as Python, Java, JavaScript, C#, Scala, etc. is required
• Expertise in creating technical and architecture documentation (e.g. HLD/LLD) is a must
• Proven ability to rapidly analyse and design solution architecture in client proposals is an added advantage
• Expertise with big data tools (Hadoop, Spark, Kafka, NoSQL databases, stream-processing systems) is a plus
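
As a toy illustration of handling both structured and flat feeds, the following standard-library sketch normalises a JSON feed and a CSV feed into the same record shape; the file names and field names are invented for the example:

```python
# Toy sketch: normalise a structured (JSON) and a flat (CSV) feed
# into one common record shape. File and field names are invented.
import csv
import json

def load_json_feed(path: str) -> list[dict]:
    with open(path, encoding="utf-8") as f:
        payload = json.load(f)
    return [{"id": r["customerId"], "amount": float(r["amount"])}
            for r in payload["records"]]

def load_csv_feed(path: str) -> list[dict]:
    with open(path, newline="", encoding="utf-8") as f:
        return [{"id": row["customer_id"], "amount": float(row["amount"])}
                for row in csv.DictReader(f)]

records = load_json_feed("feed_a.json") + load_csv_feed("feed_b.csv")
print(f"loaded {len(records)} records from both feeds")
```
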
Essential Experience:

• 5 or more years of hands-on experience in a data architect role, covering the development of ingestion, integration, data auditing, reporting, and testing with the Azure SQL tech stack
• Full data and analytics project lifecycle experience (including costing and cost management of data solutions) in an Azure PaaS environment is essential
• Microsoft Azure and Data certifications, at least fundamentals, are a must
• Experience using agile development methodologies, version control systems and repositories is a must
• A good, applied understanding of the end-to-end data process development life cycle
• A good working knowledge of data warehouse methodology using Azure SQL
• A good working knowledge of the Azure platform and its components, and the ability to leverage its resources to implement solutions, is a must
• Experience working in the public sector, or in an organisation servicing the public sector, is a must
• Ability to work to demanding deadlines, keep momentum and deal with conflicting priorities in an environment undergoing a programme of transformational change
• The ability to contribute and adhere to standards, with excellent attention to detail, strongly driven by quality
Desirables:

• Experience with AWS or Google Cloud platforms will be an added advantage
• Experience with Azure ML services will be an added advantage

Personal Attributes:

• Articulate and clear in communications to mixed audiences: in writing, through presentations and one-to-one
• Ability to present highly technical concepts and ideas in business-friendly language
• Ability to effectively prioritise and execute tasks in a high-pressure environment
• Calm and adaptable in the face of ambiguity and in a fast-paced, quick-changing environment
• Extensive experience working in a team-oriented, collaborative environment as well as working independently
• Comfortable with the multi-project, multi-tasking lifestyle of a consulting Data Architect
• Excellent interpersonal skills with teams and in building trust with clients
• Ability to support and work with cross-functional teams in a dynamic environment
• A passion for achieving business transformation; the ability to energise and excite those you work with
• Initiative; the ability to work flexibly in a team, working comfortably without direct supervision
Sportz Interactive
Remote, Mumbai, Navi Mumbai, Pune, Nashik
7 - 12 yrs
₹15L - ₹16L / yr
PostgreSQL
PL/SQL
Big Data
Optimization
Stored Procedures

Job Role: Associate Manager (Database Development)


Key Responsibilities:

  • Optimizing the performance of stored procedures and SQL queries to deliver large amounts of data within a few seconds (see the sketch after this list)
  • Designing and developing numerous complex queries, views, functions, and stored procedures to work seamlessly with the Application/Development team’s data needs
  • Responsible for providing solutions to all data-related needs to support existing and new applications
  • Creating scalable structures to cater to large user bases and manage high workloads
  • Involved in every step of a project, from requirement gathering to implementation and maintenance
  • Developing custom stored procedures and packages to support new enhancement needs
  • Working with multiple teams to design, develop and deliver early warning systems
  • Reviewing query performance and optimizing code
  • Writing queries used for front-end applications
  • Designing and coding database tables to store the application data
  • Data modelling to visualize database structure
  • Working with application developers to create optimized queries
  • Maintaining database performance by troubleshooting problems
  • Accomplishing platform upgrades and improvements by supervising system programming
  • Securing the database by developing policies, procedures, and controls
  • Designing and managing deep statistical systems
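
As a small illustration of the query-tuning loop, here is a sketch that pulls a PostgreSQL execution plan from Python with psycopg2; the connection string, table and query are placeholders, not details from this posting:

```python
# Sketch: inspect a PostgreSQL execution plan from Python via psycopg2.
# The connection string and the query being tuned are placeholders.
import psycopg2

conn = psycopg2.connect("dbname=stats user=report_ro host=localhost")
with conn.cursor() as cur:
    cur.execute("""
        EXPLAIN (ANALYZE, BUFFERS)
        SELECT player_id, SUM(points)
        FROM match_events
        WHERE match_date >= current_date - 30
        GROUP BY player_id
    """)
    for (line,) in cur.fetchall():   # each plan line is returned as a row
        print(line)
conn.close()
```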

Desired Skills and Experience:

  • 7+ years of experience in database development
  • Minimum 4+ years of experience in PostgreSQL is a must
  • Experience and in-depth knowledge of PL/SQL
  • Ability to come up with multiple possible ways of solving a problem and to decide on the most optimal approach for the use case at hand
  • Knowledge of database administration, and the ability and experience to use CLI tools for administration
  • Experience in Big Data technologies is an added advantage
  • Secondary platforms: MS SQL 2005/2008, Oracle, MySQL
  • Ability to take ownership of tasks and flexibility to work individually or in a team
  • Ability to communicate with teams and clients across time zones and global regions
  • Good communication skills and self-motivation
  • Ability to work under pressure
  • Knowledge of NoSQL and cloud architecture will be an advantage
Mobile Programming LLC
Posted by Apurva Kalsotra
Mohali, Gurugram, Pune, Bengaluru (Bangalore), Hyderabad, Chennai
3 - 8 yrs
₹2L - ₹9L / yr
Data engineering
Data engineer
Spark
Apache Spark
Apache Kafka

Responsibilities for Data Engineer

  • Create and maintain optimal data pipeline architecture.
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
  • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
  • Work with data and analytics experts to strive for greater functionality in our data systems.

Qualifications for Data Engineer

  • Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
  • Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytic skills related to working with unstructured datasets.
  • Build processes supporting data transformation, data structures, metadata, dependency and workload management.
  • A successful history of manipulating, processing and extracting value from large disconnected datasets.
  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
  • Strong project management and organizational skills.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
  • We are looking for a candidate with 5+ years of experience in a Data Engineer role, who has attained a Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. They should also have experience using the following software/tools:

  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
  • Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. (see the Airflow sketch after this list)
  • Experience with AWS cloud services: EC2, EMR, RDS, Redshift
  • Experience with stream-processing systems: Storm, Spark-Streaming, etc.
  • Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
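
To ground the workflow-management requirement, a minimal Airflow DAG with two dependent tasks might look like the sketch below (assuming Airflow 2.4+ for the schedule argument); the DAG id, schedule and task bodies are invented for illustration:

```python
# Minimal Airflow DAG sketch with two dependent tasks.
# The DAG id, schedule and task bodies are invented for illustration.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw data from the source system")

def load():
    print("write transformed data to the warehouse")

with DAG(
    dag_id="daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task   # load runs only after extract succeeds
```
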
Velocity Services
Bengaluru (Bangalore)
4 - 8 yrs
₹20L - ₹35L / yr
Data engineering
Data Engineer
Big Data
Big Data Engineer
Python

We are an early stage start-up, building new fintech products for small businesses. Founders are IIT-IIM alumni, with prior experience across management consulting, venture capital and fintech startups. We are driven by the vision to empower small business owners with technology and dramatically improve their access to financial services. To start with, we are building a simple, yet powerful solution to address a deep pain point for these owners: cash flow management. Over time, we will also add digital banking and 1-click financing to our suite of offerings.

 

We have developed an MVP which is being tested in the market. We have closed our seed funding from marquee global investors and are now actively building a world class tech team. We are a young, passionate team with a strong grip on this space and are looking to on-board enthusiastic, entrepreneurial individuals to partner with us in this exciting journey. We offer a high degree of autonomy, a collaborative fast-paced work environment and most importantly, a chance to create unparalleled impact using technology.

 

Reach out if you want to get in on the ground floor of something which can turbocharge SME banking in India!

 

Technology stack at Velocity comprises a wide variety of cutting-edge technologies like NodeJS, Ruby on Rails, Reactive Programming, Kubernetes, AWS, Python, ReactJS, Redux (Saga), Redis, Lambda, etc.

 

Key Responsibilities

  • Responsible for building data and analytical engineering pipelines with standard ELT patterns, implementing data compaction pipelines, data modelling and overseeing overall data quality

  • Work with the Office of the CTO as an active member of our architecture guild

  • Writing pipelines to consume the data from multiple sources

  • Writing a data transformation layer using DBT to transform millions of rows of data in the data warehouse (see the sketch after this list).

  • Implement Data warehouse entities with common re-usable data model designs with automation and data quality capabilities

  • Identify downstream implications of data loads/migration (e.g., data quality, regulatory)
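
As one way a DBT transformation layer can be wired into Python tooling, the sketch below invokes dbt programmatically; it assumes dbt-core 1.5+ (which exposes dbtRunner) and a hypothetical "staging" selector, and is a sketch rather than this team's actual setup:

```python
# Sketch: invoke a dbt transformation layer programmatically.
# Assumes dbt-core >= 1.5 (dbtRunner); the "staging" selector is hypothetical.
from dbt.cli.main import dbtRunner

result = dbtRunner().invoke(["run", "--select", "staging"])
if not result.success:
    raise SystemExit("dbt run failed; check logs for the failing models")
```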

 

What To Bring

  • 3+ years of software development experience; startup experience is a plus.

  • Past experience of working with Airflow and DBT is preferred

  • 2+ years of experience working in any backend programming language. 

  • Strong first-hand experience with data pipelines and relational databases such as Oracle, Postgres, SQL Server or MySQL

  • Experience with DevOps tools (GitHub, Travis CI, and JIRA) and methodologies (Lean, Agile, Scrum, Test Driven Development)

  • Experienced with the formulation of ideas, building proofs-of-concept (POCs) and converting them into production-ready projects

  • Experience building and deploying applications on on-premise and cloud-based (AWS or Google Cloud) infrastructure

  • Basic understanding of Kubernetes & Docker is a must.

  • Experience in data processing (ETL, ELT) and/or cloud-based platforms

  • Working proficiency and communication skills in verbal and written English.

 

 

 

Data
Agency job via parkcom, posted by Ravi P
Pune
6 - 15 yrs
₹7L - ₹15L / yr
ETL
Oracle
Talend
Ab Initio

We are looking for a Senior Database Developer to provide a senior-level contribution to design, develop and implement critical business enterprise applications for marketing systems.

 

  1. Play a lead role in developing, deploying and managing our databases (Oracle, MySQL and MongoDB) on public clouds.
  2. Design and develop PL/SQL processes to perform complex ETL processes.
  3. Develop UNIX and Perl scripts for data auditing and automation (a minimal auditing sketch follows this list).
  4. Responsible for database builds and change requests.
  5. Holistically define the overall reference architecture and manage its overall implementation in the production systems.
  6. Identify architecture gaps that can improve availability, performance and security for both production systems and database systems, and work towards resolving those issues.
  7. Work closely with Engineering, Architecture, Business and Operations teams to provide necessary and continuous feedback.
  8. Automate all the manual steps for the database platform.
  9. Deliver solutions for access management, availability, security, replication and patching.
  10. Troubleshoot application database performance issues.
  11. Participate in daily huddles (30 min.) to collaborate with onshore and offshore teams.
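
For flavour, a data-auditing script of the kind mentioned in item 3 often boils down to reconciling row counts between source and target; here is a driver-agnostic sketch over any two DB-API 2.0 connections. The table list is hypothetical, and Python stands in for the Perl named in the posting:

```python
# Sketch: reconcile row counts between source and target tables.
# Works with any DB-API 2.0 connections; the table list is hypothetical,
# and Python stands in for the Perl scripting named in the posting.
def table_count(conn, table: str) -> int:
    cur = conn.cursor()
    cur.execute(f"SELECT COUNT(*) FROM {table}")   # audit-only, trusted names
    (count,) = cur.fetchone()
    cur.close()
    return count

def audit(source_conn, target_conn, tables: list[str]) -> list[str]:
    """Return a human-readable line for every table whose counts differ."""
    mismatches = []
    for table in tables:
        src = table_count(source_conn, table)
        tgt = table_count(target_conn, table)
        if src != tgt:
            mismatches.append(f"{table}: source={src} target={tgt}")
    return mismatches
```
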

 

Qualifications: 

 

  1. 5+ years of experience in database development.
  2. Bachelor’s degree in Computer Science, Computer Engineering, Math, or similar.
  3. Experience using ETL tools (Talend or Ab Initio a plus).
  4. Experience with relational database programming, processing and tuning (Oracle, PL/SQL, MySQL, MS SQL Server, SQL, T-SQL).
  5. Familiarity with BI tools (Cognos, Tableau, etc.).
  6. Experience with Cloud technology (AWS, etc.).
  7. Agile or Waterfall methodology experience preferred.
  8. Experience with API integration.
  9. Advanced software development and scripting skills for use in automation and interfacing with databases.
  10. Knowledge of software development lifecycles and methodologies.
  11. Knowledge of developing procedures, packages and functions in a DW environment.
  12. Knowledge of UNIX, Linux and Service Oriented Architecture (SOA).
  13. Ability to multi-task, work under pressure, and think analytically.
  14. Ability to work with minimal supervision and meet deadlines.
  15. Ability to write technical specifications and documents.
  16. Ability to communicate effectively with individuals at all levels in the company and with various business contacts outside of the company in an articulate, professional manner.
  17. Knowledge of CDP, CRM, MDM and Business Intelligence is a plus.
  18. Flexible work hours.

 

This position description is intended to describe the duties most frequently performed by an individual in this position. It is not intended to be a complete list of assigned duties but to describe a position level.

 

Home Credit
NCR (Delhi | Gurgaon | Noida)
3 - 5 yrs
₹6L - ₹11L / yr
Business Intelligence (BI)
Data Analytics
Analytics
Oracle SQL Developer
PowerBI
Role Summary:

The position holder will be responsible for supporting various aspects of the organization's Analytical & BI activities. As a member of the team, the candidate will collaborate with a multi-disciplinary team of experts and the SMT group on a wide range of problems, with opportunities to solve critical business problems using analytical and statistical techniques.

Essential/Key Responsibilities:

• Analyzing data/reports to identify early warning signals (unusual trends, patterns, process gaps, etc.) and proactively providing feedback in order to take corrective actions, finding continuous improvements in process (improvement in performance, reducing cost, technological improvement, etc.); a minimal sketch of such a signal check follows below
• Creating/defining BI (Business Intelligence) and AI (Analytical Intelligence) standards for Home Credit
• As part of the BICC team, providing a high level of Business Intelligence support (regular reports, weekly presentations, etc.) to top management
• Ensuring automation & centralization of BI activities for better utilization of resources
• Supporting data-driven ad hoc and critical requirements

Qualifications/Requirements:

• MBA/M.Tech/B.Tech or a Bachelor's degree in a quantitative discipline such as Computer Science, Engineering, Mathematics, Statistics, Operations Research or Economics from premier/Tier 1 colleges, with a minimum of 3 years of experience in Analytics/Business Intelligence
• Highly numerate with statistical knowledge: able to work with numbers and understand data trends
• Ability to work with both business and technical communities
• Good to have: financial analysis/modeling to support the various teams on specific analysis projects

Skills/Desired Characteristics:

• Able to think analytically, using a systematic and logical approach to analyze data, problems and situations
• Good database skills with exposure to Oracle (11g) systems and tools
• Highly skilled in Excel, SQL, R/Python or Power BI/Tableau or VBA
• Ability to manage multiple deliverables with minimum guidance and proactively set up communication processes with stakeholders
• Willing to work in an IC (Individual Contributor) role
• Excellent written and verbal communication skills in English
• Good knowledge of project management and program management

Who should join us:

• If you are willing to face new challenges and want to apply your data knowledge to the growth and future of the company, Home Credit can give you this opportunity, and can provide you a platform to show your skills and suggest valuable ideas to the company.
• You will get the opportunity to work on a company-level platform and be part of the company's BI platform.
• Opportunity to work in a team of enthusiastic professionals.
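
To illustrate the early-warning-signal idea above, here is a small pandas sketch that flags points deviating more than three rolling standard deviations from a rolling mean; the metric, window and threshold are illustrative assumptions, not Home Credit specifics:

```python
# Sketch: flag early warning signals as rolling z-score outliers.
# The metric, 30-day window and 3-sigma threshold are illustrative.
import pandas as pd

def early_warnings(metric: pd.Series, window: int = 30, z: float = 3.0) -> pd.Series:
    mean = metric.rolling(window, min_periods=window).mean()
    std = metric.rolling(window, min_periods=window).std()
    return (metric - mean).abs() > z * std   # True where the metric looks unusual

# Example on synthetic daily approval-rate data with one injected anomaly
idx = pd.date_range("2024-01-01", periods=120, freq="D")
rate = pd.Series(0.7, index=idx) + pd.Series(range(120), index=idx) * 0.0001
rate.iloc[100] = 0.4                      # an unusual drop
print(rate[early_warnings(rate)])         # the injected point is flagged
```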