11+ RPD Jobs in Bangalore (Bengaluru) | RPD Job openings in Bangalore (Bengaluru)
Apply to 11+ RPD Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest RPD Job opportunities across top companies like Google, Amazon & Adobe.
Job Title : Oracle Analytics Cloud (OAC) / Fusion Data Intelligence (FDI) Specialist
Experience : 3 to 8 years
Location : All USI locations – Hyderabad, Bengaluru, Mumbai, Gurugram (preferred) and Pune, Chennai, Kolkata
Work Mode : Hybrid (2-3 days from office) or fully from office (5 days)
Mandatory Skills : Oracle Analytics Cloud (OAC), Fusion Data Intelligence (FDI), RPD, OAC Reports, Data Visualizations, SQL, PL/SQL, Oracle Databases, ODI, Oracle Cloud Infrastructure (OCI), DevOps tools, Agile methodology.
Key Responsibilities :
- Design, develop, and maintain solutions using Oracle Analytics Cloud (OAC).
- Build and optimize complex RPD models, OAC reports, and data visualizations.
- Utilize SQL and PL/SQL for data querying and performance optimization.
- Develop and manage applications hosted on Oracle Cloud Infrastructure (OCI).
- Support Oracle Cloud migrations, OBIEE upgrades, and integration projects.
- Collaborate with teams using the ODI (Oracle Data Integrator) tool for ETL processes.
- Implement cloud scripting using cURL for Oracle Cloud automation.
- Contribute to the design and implementation of Business Continuity and Disaster Recovery strategies for cloud applications.
- Participate in Agile development processes and DevOps practices including CI/CD and deployment orchestration.
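The cURL-based Oracle Cloud automation mentioned above can be sketched in Python with the standard library. This is an illustration of the request shape only: the region endpoint and compartment OCID are placeholders, and real OCI calls additionally require request signing, which is omitted here.

```python
import urllib.request

# Hedged sketch: a cURL-style call to an Oracle Cloud Infrastructure REST
# endpoint, built (but not sent) with Python's stdlib urllib. Placeholder
# region and compartment; OCI request signing is deliberately omitted.

OCI_ENDPOINT = "https://iaas.us-ashburn-1.oraclecloud.com"  # example region

def build_list_instances_request(compartment_id: str) -> urllib.request.Request:
    """Build (but do not send) a GET request listing compute instances."""
    url = f"{OCI_ENDPOINT}/20160918/instances?compartmentId={compartment_id}"
    req = urllib.request.Request(url, method="GET")
    req.add_header("Accept", "application/json")
    return req

req = build_list_instances_request("ocid1.compartment.oc1..example")
print(req.get_method(), req.get_full_url())
```

In practice the same request would be sent with an OCI SDK or a signed cURL command; the point here is only the REST shape the role works with.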
Required Skills :
- Strong hands-on expertise in Oracle Analytics Cloud (OAC) and/or Fusion Data Intelligence (FDI).
- Deep understanding of data modeling, reporting, and visualization techniques.
- Proficiency in SQL, PL/SQL, and relational databases on Oracle.
- Familiarity with DevOps tools, version control, and deployment automation.
- Working knowledge of Oracle Cloud services, scripting, and monitoring.
Good to Have :
- Prior experience in OBIEE to OAC migrations.
- Exposure to data security models and cloud performance tuning.
- Certification in Oracle Cloud-related technologies.
Key Responsibilities:
● Analyze and translate legacy MSSQL stored procedures into Snowflake Scripting (SQL) or JavaScript-based stored procedures.
● Rebuild and optimize data pipelines and transformation logic in Snowflake.
● Implement performance-tuning techniques such as query pruning, clustering keys, appropriate warehouse sizing, and materialized views.
● Monitor query performance using the Snowflake Query Profile and resolve bottlenecks.
● Ensure procedures are idempotent, efficient, and scalable for high-volume workloads.
● Collaborate with architects and data teams to ensure accurate and performant data migration.
● Write test cases to validate functional correctness and performance.
● Document changes and follow version control best practices (e.g., Git, CI/CD).
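One common way to meet the idempotency requirement above is to replace "delete then insert" T-SQL logic with a single Snowflake MERGE. The helper below only assembles the MERGE text; the table and column names are illustrative, not from any real schema.

```python
# Hedged sketch: generate an idempotent Snowflake MERGE statement to
# replace non-idempotent INSERT/DELETE logic from a legacy MSSQL
# stored procedure. Table, key, and column names are placeholders.

def build_merge_sql(target: str, source: str, key: str, cols: list[str]) -> str:
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in cols)
    col_list = ", ".join([key] + cols)
    src_list = ", ".join(f"s.{c}" for c in [key] + cols)
    return (
        f"MERGE INTO {target} t USING {source} s ON t.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({col_list}) VALUES ({src_list})"
    )

sql = build_merge_sql("dim_customer", "stg_customer", "customer_id",
                      ["name", "segment"])
print(sql)
```

Because MERGE updates matched rows and inserts only unmatched ones, re-running the procedure against the same staging data leaves the target unchanged, which is exactly the idempotency property called for.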
Required Skills:
● 4+ years of SQL development experience, including strong T-SQL proficiency.
● 2+ years of hands-on experience with Snowflake, including stored procedure development.
● Deep knowledge of query optimization and performance tuning in Snowflake.
● Familiarity with Snowflake internals: automatic clustering, micro-partitioning, result caching, and warehouse scaling.
● Solid understanding of ETL/ELT processes, preferably with tools like dbt, Informatica, or Airflow.
● Experience with CI/CD pipelines and Git-based version control.
Note : One face-to-face (F2F) round is mandatory, and as per the process, you will need to visit the office for this.
Technical Project Manager
As a Technical Project Manager, you will be leading a team to build a highly scalable and extensible big data platform that provides the foundation for collecting, storing, modelling, and analysing massive data sets from multiple channels.
Responsibilities
1. Align Sigmoid with key Client initiatives
- Interface daily with customers across leading Fortune 500 companies to understand strategic requirements.
- Connect with CIO, VP, and Director level clients on a regular basis.
- Travel to client locations.
- Ability to understand business requirements and tie them to technology solutions.
2. Build a delivery plan with domain experts and stay on track
- Design, develop and evolve highly scalable and fault-tolerant distributed components using Big Data technologies.
- Excellent experience in application development and support, integration development, and data management.
3. Build team and manage it on a day-to-day basis
- Play the key role of hiring manager to build the future of Sigmoid.
- Guide developers in day-to-day design and coding tasks, stepping into code if needed.
- Define your team structure, hire, and train your team as needed.
4. Stay up to date on the latest technology to ensure the greatest ROI for customer & Sigmoid
- Hands-on coder with good understanding of enterprise-level code.
- Design and implement APIs, abstractions, and integration patterns to solve challenging distributed computing problems.
- Experience in defining technical requirements, data extraction, data transformation, automating jobs, productionizing jobs, and exploring new big data technologies within a Parallel Processing environment.
5. Culture
- Must be a strategic thinker with the ability to think unconventionally and out of the box.
- Analytical and data-driven orientation.
- Raw intellect, talent, and energy are critical.
- Entrepreneurial and Agile: understands the demands of a private, high-growth company.
- Ability to be both a leader and hands-on "doer".
Qualifications
- A 7+ year track record of relevant work experience and a degree in Computer Science or a related technical discipline are required.
- Dynamic leader who has directly managed a team of highly competent developers in a fast-paced work environment.
- Experience in architecture and delivery of enterprise-scale applications.
Preferred Qualifications
- Experience in Agile methodology.
- Development and support experience in Big Data domain.
- Architecting, developing, implementing, and maintaining Big Data solutions.
- Experience with database modelling and development, data mining, and warehousing.
- Experience with Hadoop ecosystem (HDFS, MapReduce, Oozie, Hive, Impala, Spark, Kerberos, Kafka, etc).
Salary (Lacs): Up to 22 LPA
Required Qualifications
• 4–7 years of total experience, with a minimum of 4 years in a full-time DevOps role
• Hands-on experience with major cloud platforms (GCP, AWS, Azure, OCI); experience with more than one will be a plus
• Proficient in Kubernetes administration and container technologies (Docker, containerd)
• Strong Linux fundamentals
• Scripting skills in Python and shell scripting
• Knowledge of infrastructure as code with hands-on experience in Terraform and/or Pulumi (mandatory)
• Experience in maintaining and troubleshooting production environments
• Solid understanding of CI/CD concepts with hands-on experience in tools like Jenkins, GitLab CI, GitHub Actions, ArgoCD, Devtron, GCP Cloud Build, or Bitbucket Pipelines
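The "maintaining and troubleshooting production environments" item above often starts with scripting around Kubernetes. A minimal sketch, assuming the default `kubectl get pods` column layout (captured elsewhere as text): flag pods whose status is neither Running nor Completed.

```python
# Hedged sketch of production troubleshooting automation: parse the plain
# text of `kubectl get pods` output and flag unhealthy pods. Assumes the
# default column layout (NAME READY STATUS RESTARTS AGE); adjust the
# field index if your cluster prints extra columns.

def unhealthy_pods(kubectl_output: str) -> list[str]:
    bad = []
    lines = kubectl_output.strip().splitlines()
    for line in lines[1:]:  # skip the header row
        fields = line.split()
        name, status = fields[0], fields[2]
        if status not in ("Running", "Completed"):
            bad.append(name)
    return bad

sample = """NAME        READY   STATUS             RESTARTS   AGE
api-7d9f    1/1     Running            0          2d
worker-x1   0/1     CrashLoopBackOff   12         3h
job-42      0/1     Completed          0          1d"""
print(unhealthy_pods(sample))
```

A script like this typically feeds an alerting hook or a CI gate; in real clusters the structured `kubectl get pods -o json` output is more robust than parsing columns.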
If interested, kindly share your updated resume on 82008 31681.
Job Title: Senior Social Media Marketer
Experience: 3+ years dominating social media marketing. Bonus points if you've crushed it in the US beauty and wellness scene!
Responsibilities:
● Craft a killer social media strategy to capture customers like a boss
● Plan and execute epic organic and inorganic campaigns that leave a lasting
impression
● Build a massive social media following and foster a tight-knit community of brand
enthusiasts
● Collaborate with top influencers and other rad brands to create off-the-hook
campaigns
● Stay ahead of the game by constantly monitoring and analyzing campaign success
and making tweaks for maximum impact
● Keep your pulse on the latest social media trends and innovations so your
campaigns always stay fresh
● Work with a dream team of cross-functional rockstars to integrate social media into
the overall marketing strategy
Requirements:
● A proven track record of launching successful social media campaigns
● A deep understanding of the unique capabilities of each social media platform
● Excellent communication skills and the ability to slay a presentation
● An endless well of creativity and the ability to bring your wildest social media visions
to life
● Experience using social media analytics to measure success and optimize
campaigns
● A passion for working with influencers and partners to create truly epic campaigns
● The ability to thrive in a fast-paced, dynamic environment
If you're a Senior Social Media Guru with a killer instinct for capturing customers, we want to hear from you!
Publicis Sapient Overview:
As a Senior Associate L1 in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement package solutions, and you will independently drive design discussions to ensure the overall health of the solution.
Job Summary:
As a Senior Associate L1 in Data Engineering, you will produce technical designs and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement package solutions, and you will independently drive design discussions to ensure the overall health of the solution.
The role requires a hands-on technologist with a strong programming background in Java, Scala, or Python; experience in data ingestion, integration, and wrangling, computation, and analytics pipelines; and exposure to Hadoop ecosystem components. Hands-on knowledge of at least one of the AWS, GCP, or Azure cloud platforms is preferable.
Role & Responsibilities:
Job Title: Senior Associate L1 – Data Engineering
Your role is focused on the design, development, and delivery of solutions involving:
• Data Ingestion, Integration and Transformation
• Data Storage and Computation Frameworks, Performance Optimizations
• Analytics & Visualizations
• Infrastructure & Cloud Computing
• Data Management Platforms
• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time
• Build functionality for data analytics, search and aggregation
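Batch and real-time ingestion from heterogeneous sources, as listed above, often meet in the middle as micro-batching: a stream of records is grouped into fixed-size batches before being written downstream. A minimal pure-Python sketch of that shape (a real pipeline would use Spark, Flink, or Kafka consumers instead):

```python
# Hedged sketch: micro-batching an incoming record stream. Generic pure
# Python for illustration; batch_size and the record source are
# placeholders for whatever the real pipeline consumes.

from itertools import islice
from typing import Iterable, Iterator

def micro_batches(records: Iterable, batch_size: int) -> Iterator[list]:
    """Yield successive fixed-size batches from any iterable of records."""
    it = iter(records)
    while True:
        batch = list(islice(it, batch_size))
        if not batch:
            return
        yield batch

batches = list(micro_batches(range(7), batch_size=3))
print(batches)
```

The generator shape matters here: it never materializes the whole stream, so the same code works for an unbounded real-time source and a finite batch extract.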
Experience Guidelines:
Mandatory Experience and Competencies:
1. Overall 3.5+ years of IT experience, with 1.5+ years in data-related technologies.
2. Minimum 1.5 years of experience in Big Data technologies.
3. Hands-on experience with the Hadoop stack (HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow) and the other components required to build end-to-end data pipelines. Working knowledge of real-time data pipelines is an added advantage.
4. Strong experience in at least one of the programming languages Java, Scala, or Python (Java preferred).
5. Hands-on working knowledge of NoSQL and MPP data platforms like HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.
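The end-to-end pipeline shape named above (ingest, cleanse, reduce by key) can be sketched in a few lines of pure Python; in the Hadoop/Spark stack listed here the same steps would be Spark transformations, and the event fields below are invented for illustration.

```python
# Hedged sketch of a pipeline stage: transform (cleanse/cast) followed by
# aggregate (reduce by key). Field names and values are placeholders; a
# real job would express these as Spark DataFrame operations.

from collections import defaultdict

raw_events = [
    {"user": "a", "amount": "10"},
    {"user": "b", "amount": "5"},
    {"user": "a", "amount": "7"},
]

def transform(event: dict) -> dict:
    """Cleanse one raw record: cast the string amount to an int."""
    return {"user": event["user"], "amount": int(event["amount"])}

def aggregate(events) -> dict:
    """Reduce by key: total amount per user."""
    totals = defaultdict(int)
    for e in events:
        totals[e["user"]] += e["amount"]
    return dict(totals)

result = aggregate(transform(e) for e in raw_events)
print(result)
```

The generator expression keeps transform and aggregate composable without an intermediate list, mirroring how lazy Spark transformations chain before an action triggers execution.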
Preferred Experience and Knowledge (Good to Have):
1. Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres), with hands-on experience.
2. Knowledge of data governance processes (security, lineage, catalog) and tools like Collibra, Alation, etc.
3. Knowledge of distributed messaging frameworks like ActiveMQ/RabbitMQ/Solace, search and indexing, and microservices architectures.
4. Performance tuning and optimization of data pipelines.
5. CI/CD: infra provisioning on cloud, automated build and deployment pipelines, code quality.
6. Working knowledge of data platform related services on at least one cloud platform, IAM, and data security.
7. Cloud data specialty and other related Big Data technology certifications.
Personal Attributes:
• Strong written and verbal communication skills
• Articulation skills
• Good team player
• Self-starter who requires minimal oversight
• Ability to prioritize and manage multiple tasks
• Process orientation and the ability to define and set up processes
Want to work with an established & growing IT company? Join team Benison to have the right challenges that will help you accelerate your career growth to the next level, faster!
Benison Technologies was started in 2011 with a mission to revolutionize the silicon industry in India. With a host of amazing big clients like Google, Cisco, McAfee, and Intel, you get to experience the best of both worlds. If you consider yourself an engineer capable of joining our ever-growing team, then this is the right opportunity for you:
Why Benison Tech?
We have a partial acquisition from one of the biggest names in the world (well, we can't name them, thanks to confidentiality). It's one of the FAANG companies, and you can "Google" it if you like.
Oh, and one more thing: this did not happen by accident. Our team put in a ton of effort to turn this gigantic dream into reality.
Benison Tech has a consistent history of demonstrating growth through innovation time and again.
We don't stop there: we then re-invest our profits back into initiatives for the growth of our people, our culture, and the company. Now, enough about us; let's talk about the job roles and responsibilities:
What you will be working on:
- You will be working on the next generation network security products, on various public clouds.
- In addition to development, you will also get involved in architectural changes while fixing legacy issues.
- Planning, design, and integration for network security platforms.
- Key contributor for developing product strategies and features.
- You will work on hardcore data networking/forwarding areas while acquiring in-depth knowledge of DPDK and VPP.
Here are some technical skills required:
- Programming languages: C/C++
- Work experience in DPDK and the packet-forwarding area.
- Good understanding of the Linux kernel and the in-kernel networking stack.
- Exposure to clouds like AWS and GCP is a big plus.
- Understanding of/exposure to VPP is a big plus.
- Strong in packet forwarding and tunneling (IPsec, GRE, VxLAN), etc.
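For the tunneling protocols listed above, the concrete artifact is a fixed header layout. As a minimal sketch, here is a decode of the 8-byte VXLAN header (RFC 7348) that follows the outer UDP header. Production forwarding code in this space would be C with DPDK; Python is used here only to show the layout.

```python
import struct

# Hedged sketch: decode the 8-byte VXLAN header (RFC 7348).
# Layout: 1 byte flags (0x08 = VNI valid), 3 reserved bytes,
# 3-byte VNI, 1 reserved byte.

def parse_vxlan_header(data: bytes) -> dict:
    if len(data) < 8:
        raise ValueError("VXLAN header is 8 bytes")
    # "!B3xI": network byte order, flags byte, skip 3 reserved bytes,
    # then read VNI + trailing reserved byte as one 32-bit word.
    flags, vni_and_rsvd = struct.unpack("!B3xI", data[:8])
    return {
        "vni_valid": bool(flags & 0x08),   # the I flag
        "vni": vni_and_rsvd >> 8,          # top 24 bits carry the VNI
    }

# Example frame bytes: flags=0x08, reserved, VNI=0x00ABCD, reserved
hdr = bytes([0x08, 0, 0, 0, 0x00, 0xAB, 0xCD, 0x00])
print(parse_vxlan_header(hdr))
```

The same flags/VNI extraction is what a DPDK fast path does on the `rte_mbuf` payload after stripping the outer Ethernet/IP/UDP headers.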
What we expect from you:
- 2+ years of relevant experience
- Design, develop, and test the various features in DPDK/VPP-based systems; interact with the customer to help define the features, then execute and deliver them.
- Mentor the junior team members.
- Participate in code-reviews and ensure the quality of deliverables.
If the above fits your skill-sets and tickles your interest then read below about the additional benefits that our company offers to talented folks like you:
Work Culture and Benefits
- Competitive salary and benefits package (H1-B, which means a chance to work onsite outside of India)
- A culture focused on talent development, where you get promoted within the quarterly cycle of your anniversary.
- Opportunity to work with cutting-edge & challenging technologies including legacy tech.
- Open cafeteria to grab some munchies while you work. We make sure the space feels like your second home; you can also wear pyjamas if you like.
- Employee engagement initiatives such as project parties, flexible work hours, long service awards, and team bonding activities within the company, plus extra learning and personal development trainings, because why stop your learning at one thing!
- Insurance coverage: Group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and your parents. (With some of the best insurance partners in India)
- Enjoy collaborative innovation (each member gets to innovate & think out of the box), along with highly experienced team managers who maintain diversity and work-life well-being.
- And of course, you get to work on projects from some of the most recognised brands within the networking and security space of the world, unlocking global opportunities to learn, grow & contribute in a way that is truly impactful yet purposeful at the same time.
Still not satisfied, and want more proof?
Head to our website https://benisontech.com to learn more.
Experience as a UI/UX Designer, as well as a strong portfolio of related projects
• Proficient in Adobe Creative Suite (specifically Illustrator, InDesign, and Photoshop)
• Proficient in prototyping tools such as Sketch, InVision, etc.
• Basic HTML5 skills are a plus. Should have experience in Figma and Photoshop.
Job Summary
- BS/BE/BCA/MSC/MCA degree in Computer Science, Engineering or a related subject
- Hands-on experience in designing and developing applications using Java EE platforms is preferable
- Object-oriented analysis and design using common design patterns
- Profound insight into Java and J2EE internals
- Excellent knowledge of relational databases and SQL
- Experience in developing web applications using at least one popular web framework (JSF, HTML5, MVC)
- Knowledge of microservices and containers/Docker would be an added advantage
- Knowledge of data science would be preferred
- Exposure to building APIs, REST services, and web services
- Exposure to open source like TensorFlow, NiFi, StreamPipes, etc.
- Experience with test-driven development
- Good communication skills and client-oriented attitude
- Organized and detail-oriented person
- Problem solving skills, analytical mind and positive attitude
- Results oriented and focused on meeting deliverable timelines
- Availability to travel, if needed
- Fluency in English is a must
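The test-driven development item above can be illustrated with a minimal sketch: write the failing assertions first, then the simplest code that makes them pass. The function under test is hypothetical, not from any real codebase (the listing's stack is Java/J2EE; Python is used here only for brevity).

```python
# Hedged sketch of test-driven development: the assertions below were
# written first, and normalize_part_number is the minimal implementation
# that satisfies them. The part-number rule itself is invented.

def normalize_part_number(raw: str) -> str:
    """Uppercase and strip separators, so 'ab-12 34' becomes 'AB1234'."""
    return "".join(ch for ch in raw.upper() if ch.isalnum())

# The tests that drove the implementation:
assert normalize_part_number("ab-12 34") == "AB1234"
assert normalize_part_number("AB1234") == "AB1234"
print("tests passed")
```

The cycle (red test, minimal green code, refactor) is identical in JUnit for the Java/J2EE stack this role actually uses.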
Responsibilities and Duties
- Design and develop features and modules for mission-critical applications
- Build modules on MES products (SAP, Apriso, Rockwell, etc.)
- Contribute to all phases of the development lifecycle
- Write well-designed, testable, efficient code
- Ensure designs are in compliance with specifications
- Prepare and produce releases of software components
- Support continuous improvement by investigating alternatives and technologies
Required Experience, Skills and Qualifications
2 - 5 years of hands-on Software Development experience using the below mentioned Technologies
- Java / J2EE
- EJB, JSF, Servlets
- HTML, HTML5
- SQL Server / Oracle
- JSON, web services, etc.
Benefits
- Candidate would be Trained on SAP modules.
- Industry best pay.
Cloud Developer
● Overall 6-8 years of IT experience, including Java/.NET-based software development, with a minimum of 2-3 years of experience in developing applications on the cloud (AWS/Azure/Google)
● Excellent understanding of, and hands-on experience with, cloud computing concepts including but not limited to microservices, containerization, DevOps, etc.
● Excellent knowledge of cloud-native computing technologies and current computing trends
● Ability to effectively address customer NFRs with the most suitable cloud/open-source services available
● Updated on latest Cloud offerings
● In depth experience in problem solving, guiding team members on cloud development challenges
● Expertise in preparing technical architecture for cloud development
● Hands-on experience in at least one multi-cloud/hybrid-cloud model implementation utilizing leading platforms like Red Hat OpenShift, Google Anthos, or VMware Tanzu
● Implementation experience in leading open-source technologies like Spring Boot, Spring Batch, Spring Cloud, Drools Rule Engine, etc.
● Should be able to understand customer cloud requirements and implement technical solutions
● Experience in designing and implementing reusable components/accelerators
● Ability to participate in solution discussions with customers
● Hands on with DevOps implementation
● Hands-on experience in developing POCs and pilots is a must
● Experience in a cloud CoE will be an added advantage
● Certified developer in AWS, Azure or Google
● Effective communication skills (written and verbal) for seamless cloud based development
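One cloud-native practice implied by the list above is externalizing configuration into environment variables (the 12-factor approach), so the same build runs unchanged across AWS, Azure, or Google environments. A minimal sketch, with placeholder variable names:

```python
import os

# Hedged sketch: 12-factor configuration loading. The APP_* variable
# names and defaults are placeholders, not from any real service.

def load_config(env: dict = os.environ) -> dict:
    """Read service configuration from environment variables,
    falling back to safe local-development defaults."""
    return {
        "db_url": env.get("APP_DB_URL", "postgres://localhost/dev"),
        "replicas": int(env.get("APP_REPLICAS", "1")),
        "debug": env.get("APP_DEBUG", "false").lower() == "true",
    }

# Simulate a deployment environment rather than mutating os.environ:
cfg = load_config({"APP_REPLICAS": "3", "APP_DEBUG": "true"})
print(cfg)
```

Passing the environment as a parameter keeps the loader testable; in Kubernetes these variables typically arrive via ConfigMaps and Secrets, and the Spring Boot equivalent is externalized `application.properties` overrides.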
"Need candidates with a notice period of 30-45 days."
- Experience: 3-5 Years
- Scripting: PowerShell and either JavaScript or Python
- Hands-on with Kubernetes and Docker
- Good to have: either Azure or AWS
- Hands-on with any of the DB technologies: NoSQL administration, Cosmos DB, MongoDB, MariaDB
- Good to have: analytics knowledge