11+ SAP IS-U Jobs in Bangalore (Bengaluru) | SAP IS-U Job openings in Bangalore (Bengaluru)
We are hiring for Package Consultant - SAP IS Oil & Gas.
Job description:
SAP IS-Oil TD/TSW/Excise Duty consultant: an IS-Oil consultant with MM knowledge. Functional consultant to design and implement the truck/logistics solution and set up new legal entities as part of the customer's growth strategy in new countries. Good communication and stakeholder management skills are required.
Experience: 5+ years
Warm Regards
Abha Kumari
Senior HR Executive
Codersbrain Technology Pvt Ltd
www.codersbrain.com
We are looking for an experienced GCP Cloud Engineer to design, implement, and manage cloud-based solutions on Google Cloud Platform (GCP). The ideal candidate should have expertise in GKE (Google Kubernetes Engine), Cloud Run, Cloud Load Balancing, Cloud Functions, Azure DevOps, and Terraform, with a strong focus on automation, security, and scalability.
You will work closely with development, operations, and security teams to ensure robust cloud infrastructure and CI/CD pipelines while optimizing performance and cost.
Key Responsibilities:
1. Cloud Infrastructure Design & Management
· Architect, deploy, and maintain GCP cloud resources via Terraform and other automation tooling.
· Implement Google Cloud Storage, Cloud SQL, and Filestore for data storage and processing needs.
· Manage and configure Cloud Load Balancers (HTTP(S), TCP/UDP, and SSL Proxy) for high availability and scalability.
· Optimize resource allocation, monitoring, and cost efficiency across GCP environments.
2. Kubernetes & Container Orchestration
· Deploy, manage, and optimize workloads on Google Kubernetes Engine (GKE).
· Work with Helm charts, Istio, and service meshes for microservices deployments.
· Automate scaling, rolling updates, and zero-downtime deployments (see the Python sketch after this section).
3. Serverless & Compute Services
· Deploy and manage applications on Cloud Run and Cloud Functions for scalable, serverless workloads.
· Optimize containerized applications running on Cloud Run for cost efficiency and performance.
4. CI/CD & DevOps Automation
· Design, implement, and manage CI/CD pipelines using Azure DevOps.
· Automate infrastructure deployment using Terraform, Bash, and PowerShell scripting.
· Integrate security and compliance checks into the DevOps workflow (DevSecOps).
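As a hedged illustration of the GKE automation described above, the sketch below uses the official kubernetes Python client to request a rolling restart of a deployment, one common way zero-downtime rollouts are triggered. The namespace and deployment name are placeholders, and it assumes cluster credentials are already configured locally (for example via gcloud container clusters get-credentials).

```python
# Hedged sketch: trigger a rolling restart of a GKE deployment with the
# kubernetes Python client. NAMESPACE and DEPLOYMENT are placeholders.
from datetime import datetime, timezone
from kubernetes import client, config

NAMESPACE = "default"        # placeholder namespace
DEPLOYMENT = "example-api"   # placeholder deployment name

def rolling_restart(namespace: str, deployment: str) -> None:
    """Patch the pod-template annotation so Kubernetes performs a rolling restart."""
    config.load_kube_config()  # assumes local kubeconfig already points at the cluster
    apps = client.AppsV1Api()
    patch = {
        "spec": {
            "template": {
                "metadata": {
                    "annotations": {
                        "kubectl.kubernetes.io/restartedAt": datetime.now(timezone.utc).isoformat()
                    }
                }
            }
        }
    }
    apps.patch_namespaced_deployment(deployment, namespace, patch)

if __name__ == "__main__":
    rolling_restart(NAMESPACE, DEPLOYMENT)
    print(f"Rolling restart requested for {DEPLOYMENT} in {NAMESPACE}")
```

This mirrors what `kubectl rollout restart deployment/<name>` does; in a pipeline it would typically run as a deployment step rather than by hand.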
Required Skills & Qualifications:
✔ Experience: 8+ years in Cloud Engineering, with a focus on GCP.
✔ Cloud Expertise: Strong knowledge of GCP services (GKE, Compute Engine, IAM, VPC, Cloud Storage, Cloud SQL, Cloud Functions).
✔ Kubernetes & Containers: Experience with GKE, Docker, GKE Networking, Helm.
✔ DevOps Tools: Hands-on experience with Azure DevOps for CI/CD pipeline automation.
✔ Infrastructure-as-Code (IaC): Expertise in Terraform for provisioning cloud resources.
✔ Scripting & Automation: Proficiency in Python, Bash, or PowerShell for automation (see the Python sketch after this list).
✔ Security & Compliance: Knowledge of cloud security principles, IAM, and compliance standards.
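To ground the scripting-and-automation requirement, here is a minimal, hedged Python sketch that provisions a Cloud Storage bucket with the google-cloud-storage client library. It assumes the library is installed and Application Default Credentials are configured; the project ID, bucket name, and region are placeholders, not values from this posting.

```python
# Minimal GCP automation sketch (assumption: google-cloud-storage is installed
# and Application Default Credentials are available in the environment).
from google.cloud import storage

PROJECT_ID = "example-project"      # placeholder project ID
BUCKET_NAME = "example-ops-bucket"  # placeholder bucket name
LOCATION = "asia-south1"            # placeholder region (Mumbai)

def ensure_bucket(project_id: str, bucket_name: str, location: str) -> storage.Bucket:
    """Create the bucket if it does not exist, then label it for cost tracking."""
    client = storage.Client(project=project_id)
    bucket = client.lookup_bucket(bucket_name)
    if bucket is None:
        bucket = client.create_bucket(bucket_name, location=location)
    bucket.labels = {"team": "platform", "managed-by": "automation"}
    bucket.patch()  # push the label change to GCP
    return bucket

if __name__ == "__main__":
    b = ensure_bucket(PROJECT_ID, BUCKET_NAME, LOCATION)
    print(f"Bucket {b.name} ready in {b.location}")
```

In practice, baseline infrastructure like this would usually be owned by Terraform; Python scripts of this shape are more typical for one-off operational tasks and CI/CD glue.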
EXCITED ABOUT YOUR TASKS?
● Employee Records Management:
○ Maintain and update employee records, ensuring accuracy and confidentiality.
○ Prepare and process employment documentation, including new hire, termination, and promotion forms.
○ Manage HRIS (Human Resources Information System) to ensure all records are current.
● Recruitment Support:
○ Assist with the recruitment process by posting job openings, scheduling interviews, and coordinating candidate communications.
○ Help onboard new employees, including preparing induction materials, facilitating orientation, and ensuring all necessary paperwork is completed.
● Payroll and Benefits Administration:
○ Assist with the preparation and processing of payroll data.
○ Ensure proper documentation for employee benefits enrollment, changes, and claims.
○ Coordinate with finance for payroll queries and discrepancies.
● HR Compliance and Reporting:
○ Support HR with compliance tasks, such as ensuring the company adheres to labor laws and employment regulations.
○ Prepare HR reports, including headcount, turnover rates, and other metrics as needed.
● Employee Relations Support:
○ Provide general administrative support to HR managers and employees.
○ Respond to employee inquiries regarding HR policies and procedures.
● Training and Development:
○ Assist in organizing training sessions, workshops, and other employee development activities.
○ Maintain records of training and professional development activities.
● General Administrative Tasks:
○ Assist in managing office supplies, HR documents, and communication materials.
○ Coordinate meetings, appointments, and other scheduling activities for the HR department.
WHAT WILL YOU NEED TO SUCCEED?
● Bachelor’s degree in Human Resources, Business Administration, or related field.
● Proven experience as an HR Administrator or in a similar administrative role.
● Strong knowledge of HR software, MS Office Suite (Excel, Word, PowerPoint), and HRIS systems.
● Excellent organizational, time-management, and multitasking skills.
● Attention to detail and a high degree of confidentiality.
● Familiarity with employment laws and regulations.
● Effective communication skills, both written and verbal.
● Problem-solving ability and a proactive approach to tasks.
Working Days:
Monday to Friday 10 am - 7 pm (WFO)
Saturdays: 10 am - 3 pm (WFH)
However, the candidate must be comfortable with flexible shifts, if required.
Looking for candidates from the Indiranagar, Bangalore location ONLY.
Female candidates preferred.

Looking for Linux BSP Engineers for one of our clients
Skills Required: C Programming, Device Driver Development, BSP, U-Boot, Board Bring-up
Location: Bangalore
Experience: 3 to 6 years
Education/Qualification: B.E/B.Tech (EEE / ECE)
Domain: Bootloader, Linux BSP, device drivers for Ethernet, PCIe, USB, etc.
○ U-Boot and Linux porting / upgrading to ARM based SoC
○ Device drivers for SPI, I2C, Touch screen, MEMS Sensors
○ Device drivers for Audio, camera, display etc.
Additional considerations:
Must have: ground-up driver development and debugging experience.
- Big data developer with 8+ years of professional IT experience with expertise in Hadoop ecosystem components in ingestion, Data modeling, querying, processing, storage, analysis, Data Integration and Implementing enterprise level systems spanning Big Data.
- A skilled developer with strong problem solving, debugging and analytical capabilities, who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components such as Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, ZooKeeper, YARN, Flume, Pig, NiFi, Scala, and Oozie.
- Hands on experience in creating real-time data streaming solutions using Apache Spark core, Spark SQL & DataFrames, Kafka, Spark Streaming, and Apache Storm (a hedged sketch follows this list).
- Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, which include the NameNode, DataNode, ResourceManager, NodeManager, and Job History Server.
- Worked on both Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in the installation, configuration, and management of Big Data and the underlying infrastructure of a Hadoop cluster.
- Hands on experience in coding MapReduce/Yarn Programs using Java, Scala and Python for analyzing Big Data.
- Exposure to Cloudera development environment and management using Cloudera Manager.
- Extensively worked on Spark using Scala on clusters for analytics; installed it on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL/Oracle.
- Implemented Spark using Python, utilizing DataFrames and the Spark SQL API for faster processing of data; handled importing data from different data sources into HDFS using Sqoop, performing transformations using Hive and MapReduce, and then loading the data into HDFS.
- Used the Spark DataFrames API on the Cloudera platform to perform analytics on Hive data.
- Hands on experience with Spark MLlib, used for predictive intelligence, customer segmentation, and smooth maintenance of Spark Streaming applications.
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Worked on creating data pipelines for different ingestion and aggregation events, loading consumer response data into Hive external tables in an HDFS location to serve as a feed for Tableau dashboards.
- Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
- In-depth Understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
- Hands on expertise in real time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
- Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
- Experience with the Microsoft cloud and with setting up clusters on Amazon EC2 & S3, including automating the setup and extension of clusters in the AWS cloud.
- Extensively worked on Spark using Python on clusters for analytics; installed it on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL.
- Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
- Knowledge in installation, configuration, supporting and managing Hadoop Clusters using Apache, Cloudera (CDH3, CDH4) distributions and on Amazon web services (AWS).
- Experienced in writing ad hoc queries using Cloudera Impala; also used Impala analytical functions.
- Experience in creating Data frames using PySpark and performing operation on the Data frames using Python.
- In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS and MapReduce Programming Paradigm, High Availability and YARN architecture.
- Established multiple connections to different Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provided the access needed for pulling information for analysis.
- Generated various kinds of knowledge reports using Power BI based on business specifications.
- Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
- Well experienced in projects using JIRA, testing, and the Maven and Jenkins build tools.
- Experienced in designing, building, deploying, and utilizing almost all of the AWS stack (including EC2 and S3), focusing on high availability, fault tolerance, and auto-scaling.
- Good experience with use-case development, with Software methodologies like Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
- Good working experience in importing data using Sqoop and SFTP from various sources like RDBMS, Teradata, mainframes, Oracle, and Netezza into HDFS, and performing transformations on it using Hive, Pig, and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases including HBase, Cassandra, and MongoDB, and their integration with a Hadoop cluster.
- Hands on experience in Hadoop Big Data technology, working with MapReduce, Pig, and Hive as analysis tools, and Sqoop and Flume as data import/export tools.
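The streaming bullet above references Spark Structured Streaming with Kafka; the following is a minimal, hedged PySpark sketch of that pattern. The broker address, topic name, and event schema are illustrative placeholders, and running it requires the spark-sql-kafka connector package on the Spark classpath.

```python
# Hedged sketch: windowed aggregation over a Kafka topic with PySpark
# Structured Streaming. Broker, topic, and schema are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("kafka-stream-demo").getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
       .option("subscribe", "consumer-events")             # placeholder topic
       .load())

# Kafka delivers bytes; cast the value to a string and parse the JSON payload.
events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(F.from_json("json", schema).alias("e"))
          .select("e.*"))

# Five-minute tumbling-window totals, tolerating ten minutes of late data.
agg = (events
       .withWatermark("event_time", "10 minutes")
       .groupBy(F.window("event_time", "5 minutes"))
       .agg(F.sum("amount").alias("total_amount")))

query = (agg.writeStream
         .outputMode("update")
         .format("console")
         .option("truncate", False)
         .start())
query.awaitTermination()
```

A batch equivalent would swap readStream/writeStream for read/write and could target Hive external tables in HDFS rather than the console sink.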
- 4-6 yrs. of relevant Salesforce development experience.
- Experience on the Salesforce Lightning Platform, including Process Builder, Workflows, Lightning App Builder, etc.
- Experience with Salesforce development, including Lightning Components, Apex, SOQL, and SOSL (a hedged sketch follows this list).
- Experience with Pardot form/template development and integration.
- Thorough with the Community portal and customization.
- Experience with assessing Salesforce patching and critical updates.
- Certified as a Salesforce Platform Developer and actively renewing.
- Salesforce CPQ and Product/Price Book configuration experience.
- Proficiency with web application development, including JavaScript, HTML, and CSS (optional).
- Proficiency with server-side languages such as Python, Ruby, Java, PHP, and .NET (optional).
- Experience with technical architecture for Salesforce.com.
- Excellent verbal communication skills.
- Experience with DevOps and CI/CD technologies.
- Good problem-solving skills.
- Ability to provide Salesforce custom solutions and provide oversight on improvements.
- Ability to quickly adapt to changes.
- Additional preferences or certifications: Certified as a Salesforce Platform Developer and actively renewing.
- Soft skills required: good communication skills, client-facing experience.
- Flexibility with timings, as we support US-based clients and there are deployments during non-business hours and weekends.
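Since this document's examples are in Python, here is a hedged sketch of running a SOQL query against the Salesforce REST API with the requests library. The instance URL, API version, and access token are placeholders; it assumes an OAuth access token has already been obtained out of band (for example via a connected app).

```python
# Hedged sketch: run a SOQL query via the Salesforce REST API.
# INSTANCE_URL, API_VERSION, and ACCESS_TOKEN are placeholders, not real values.
import requests

INSTANCE_URL = "https://example.my.salesforce.com"  # placeholder org instance
API_VERSION = "v58.0"                               # placeholder API version
ACCESS_TOKEN = "REPLACE_WITH_OAUTH_TOKEN"           # obtained out of band

def run_soql(soql: str) -> list[dict]:
    """Execute a SOQL query and return the matching records."""
    url = f"{INSTANCE_URL}/services/data/{API_VERSION}/query/"
    resp = requests.get(
        url,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        params={"q": soql},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["records"]

if __name__ == "__main__":
    for record in run_soql("SELECT Id, Name FROM Account LIMIT 5"):
        print(record["Id"], record["Name"])
```

Inside the platform itself the same query would be written directly in Apex; the REST call above is the shape an external integration or test harness would take.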
• 2-5 years of experience building React and/or mobile applications
• 5-8 years working with microservices, API servers, databases, cloud-native development, observability, alerting, and monitoring
• Deep exposure to cloud services, preferably Azure
• Preferably worked in the Finance/Retail domain or other similar domains with complex business requirements.
• Hands-on skills combined with leadership qualities to guide teams.
Location – Bangalore, Mumbai, Gurgaon
Functional / Technical Skills:
• Strong understanding of networking fundamentals
o OSI Stack, DNS, TCP protocols
o Browser rendering and various stages of execution
• Good understanding of RESTful APIs, GraphQL and Web Sockets
• Ability to debug and profile Web/Mobile applications with Chrome DevTools or Native profilers
• Strong understanding of Distributed Systems, Fault Tolerance, and Resiliency. Exposure to setting up and managing chaos experiments is a plus.
• Exposure to Domain-Driven Design (DDD), SOLID principles, and data modelling on various RDBMS and NoSQL databases.
• Ability to define and document performance goals, SLAs, and volumetrics; create a framework for measuring and validating the goals; and work with teams to implement and meet them.
• Create automation scripts to measure performance and make them part of the CI/CD process (see the Python sketch after this list).
• Good understanding of CNCF projects, with a specific focus on observability, monitoring, tracing, sidecars, and Kubernetes.
• Tuning of cloud-native deployments with a focus on cost optimization.
• Participate in architecture reviews to identify potential issues and bottlenecks and provide early guidance.
• Deep knowledge of at least 2 different programming languages and runtimes: any two of Ruby, Python, Swift, Go, Rust, C#, Dart, Kotlin, Java, Haskell, OCaml.
• Excellent verbal and written communication
• A mindset to constantly learn new things and challenge the Status Quo.
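The automation-scripts item above can be grounded with a small example. Below is a hedged Python sketch that measures request latency against a REST endpoint and fails a CI stage when the p95 exceeds a target; the endpoint URL, sample count, and SLA threshold are hypothetical placeholders.

```python
# Hedged sketch: measure p95 latency of an endpoint and gate a CI stage on it.
# ENDPOINT and SLA_P95_MS are placeholders, not values from this posting.
import statistics
import time
import requests

ENDPOINT = "https://api.example.com/health"  # placeholder endpoint
SLA_P95_MS = 300                              # hypothetical SLA threshold in ms

def measure(n: int = 50) -> list[float]:
    """Issue n sequential requests and return each round-trip time in milliseconds."""
    samples = []
    for _ in range(n):
        start = time.perf_counter()
        resp = requests.get(ENDPOINT, timeout=5)
        resp.raise_for_status()
        samples.append((time.perf_counter() - start) * 1000)
    return samples

if __name__ == "__main__":
    latencies = measure()
    p95 = statistics.quantiles(latencies, n=20)[-1]  # last cut point is the 95th percentile
    print(f"p95 latency: {p95:.1f} ms (SLA: {SLA_P95_MS} ms)")
    raise SystemExit(0 if p95 <= SLA_P95_MS else 1)  # non-zero exit fails the CI stage
```

In a pipeline this would typically run as a post-deployment stage, with the non-zero exit code gating promotion to the next environment.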
~ Lead conversion
~ Revenue generation
~ Proactively identifying cross-selling/up-selling opportunities with existing customers
- Work Mode: Bangalore (Work from Office)
- Qualification: Graduation (Tech Background Preferable)
About SLAY
SLAY Coffee is India's best-rated and fastest growing coffee brand with a vision to democratise great coffee experiences through technology & innovative customer experiences. With 3 core lines of business (SLAY Cloud for Online, SLAY Coffee Bars for Grab & Go and SLAY Packaged Coffees for home consumption), SLAY Coffee’s core promise is sophisticated convenience at pocket-friendly prices. The brand is 2 years old and currently has a footprint of over 150 locations across 8 cities.
Culture at SLAY
Excellence is our North Star and our culture fuels this pursuit. We work together as a sports team; each member is united by a common purpose and drives excellence by bringing their unique self to the team.
We aim to build and nurture a team that finds meaning and fulfillment in the work they do at SLAY. If who you are and what you value is in alignment with what SLAY is and what we value, then the pursuit of excellence becomes meaningful.
The following principles guide us in our day to day working at SLAY:
- Customer first
- Transparency
- Simplicity in thought and action
- Long term thinking
- Passion
- Data over opinion, company and team over self
- Integrity and respect
- Communication for impact
Core Values
Our core values define how we operate and bring about an alignment of purpose. We learn and get inspired from each other in strengthening these behaviors & skills both at an individual level and collectively at the Company level. New members are added to the team and existing team members are rewarded and developed based on these values.
- Courage:
- To think big, act without fear and voice your opinion.
- To question the status quo- go beyond rules and hierarchies.
- To act in the best interest of the customer and company, even if it calls for going outside your comfort zone.
- Ownership:
- Seek to understand the context and objective, then drive for results
- Collaborate independently across teams
- Communicate proactively. Eliminate the need for follow ups
- Act with authority
- Action
- Prioritize action over discussions
- Run small pilots for each idea
- Make mistakes, learn, don't repeat.
- Curiosity
- To understand our customers and markets deeply beyond the stated
- To push the boundaries and innovate
- To never settle for less
Beyond the salary (and great coffee!)
- Freedom to test out any idea in your function or outside of it.
- Gender-neutral policies
- Free Books Program- buy any book that you would like to read for the company library. The Company pays for it.
- 25% of your time can be devoted to the causes you care about. (Pursue your passion or hobby)
- Exclusive talks & insights from the experts for the growth & development of our team
- Full tuition fees sponsorship for Baristas
- Family support programs for Baristas
Opportunity
As a team member in SLAY’s growth team, you will work on projects to grow the brand’s reach, resonance and resultant goodness.
Your Boss
You will work directly with the Co-founder. This position is being hired for the Centre of Excellence team at SLAY.
Your Team
You will collaborate with business teams to develop in-house training & learning strategies, creative teams to develop impactful creatives, operations teams for execution, and external partners to set up and manage specific learning and development initiatives.
Key Result Areas
- Drive technology first learning interventions for all members of the organization.
- Design and deploy the end to end training framework from training needs identification to continuous assessments and certifications.
- Develop content using instructional design methodologies. Suggest and deploy new age and industry first tools and systems.
- Own the competency dictionary of the organization and drive brand values and culture philosophy through all training and development activities.
- Manage and guide the training team which will include instructional designers, content developers, training coordinators etc.
- Keep abreast of training trends, developments and best practices of the Industry.
Your role requires knowledge and demonstrable experience of:
- 6+ years of experience in developing and executing successful training initiatives.
- Familiar with traditional and modern training processes
- Meticulous attention to detail
- Ability to plan, multi-task and manage time effectively
- Entrepreneurial with the ability to think of industry first and rule breaking initiatives
- Organizational skills
Next Steps:
If you wish to explore this opportunity further,
- Send us a copy of your latest resume
- A quick note on what excites you about working with SLAY. You can do this as an audio file, video file or a simple write up as well.
- We will schedule one to one interactions with your potential boss, team members and peers.

Our client is an IT infrastructure company located in Bangalore.
We are seeking an API Test Engineer with at least 6 years of experience in Software Quality Assurance. In this role you will participate in the processes, tools, techniques, and practices for assuring adherence to quality standards for laboratory medical device new product development. Candidates must have database and coding skills.
Responsibilities
- Writes and automates software testing through API / Integration Tests
- Ensures deliverables meet audit criteria. Interfaces with the Scrum Team to ensure that systems are developed to meet the business need and specifications through API / Integration tests.
- Develops, publishes, and implements test scripts to ensure quality applications
- Works with multi-discipline teams on new product introduction projects while adhering to software development and source control processes
Qualifications
- Bachelor’s degree in Software Engineering, Computer Science, or a related field with at least 6 years of experience
- Experience with automated API / Integration test technologies such as FitNesse, REST API, Cucumber, or others (see the pytest sketch after this section)
- Candidates must be able to do API and database testing
- Communicates and endorses strict adherence to methodologies, processes, and standards
- Performs risk-based testing
- Maintains defect documentation and logs
- Experience in Agile development processes and philosophies
- History of timely delivery while ensuring a quality focus
- Ability to work well with people and be both highly motivated and motivating
- Ability to work in a fast-paced, and often ambiguous environment where continuous improvement is a way of life
- Ability to work independently and proactively with minimal direction
Nice to Have
- Understanding of software development processes for a regulated environment (ISO9000/FDA) is a plus
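As a hedged illustration of the API/integration testing this role calls for, here is a minimal pytest-plus-requests sketch. The base URL and the /samples resource are hypothetical and stand in for whatever API the actual system under test exposes.

```python
# Hedged sketch: API integration test with pytest and requests.
# BASE_URL and the /samples resource are hypothetical placeholders.
import pytest
import requests

BASE_URL = "https://api.example.com"  # placeholder base URL of the system under test

@pytest.fixture(scope="session")
def session():
    """Shared HTTP session so all tests reuse connections and default headers."""
    s = requests.Session()
    s.headers.update({"Accept": "application/json"})
    return s

def test_create_and_fetch_sample(session):
    # Create a record via the API (hypothetical /samples resource).
    payload = {"name": "QC-001", "status": "pending"}
    created = session.post(f"{BASE_URL}/samples", json=payload, timeout=10)
    assert created.status_code == 201
    sample_id = created.json()["id"]

    # Fetch it back and verify the fields round-trip correctly.
    fetched = session.get(f"{BASE_URL}/samples/{sample_id}", timeout=10)
    assert fetched.status_code == 200
    body = fetched.json()
    assert body["name"] == payload["name"]
    assert body["status"] == payload["status"]
```

Run with `pytest -q`; in practice the base URL and credentials would come from environment-specific configuration rather than constants.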
Sales Specialist
(Outbound Sales Specialist)
Experience - 3 to 4 years
Location - Pune & Bangalore
We are looking for a talented and competitive Sales Specialist who thrives in a quick sales cycle environment. If you're someone who loves travelling and has a strong desire to make a career in IT Sales, then you have reached the right place. The ideal candidate should have the hunger to perform consistently and meet expectations. You must be comfortable meeting end customers to discuss and understand their mobility requirements.
Key Responsibilities
The individual role that you’ll play in our team:
- Establish and meet annual sales quotas for regions and territories; project expected sales volume for direct and partner accounts.
- Meet end customers to discuss their IT mobility needs and requirements; should be able to map the customer organigram and propose solutions based on needs.
- Should be able to manage OEM/reseller/system integration partners for product promotion and push for targets.
- Should be ready to travel within the South India region.
- Manage existing and potential new accounts, and provide sales support to distribution partners to participate in closing an order or to facilitate and add value to the selling process.
- Should be a team player, working with internal stakeholders and allied teams.
- Develop and implement new sales initiatives, strategies and programs to capture new customers
- Should be good at communication; knowledge of using a CRM is an added advantage.
- Maintain records of all sales leads and customer accounts
- Monitor the company’s industry competitors, new products, and market conditions to understand a customer's specific needs.
- Develop and sustain knowledge of product features, offerings, and associated technologies.
- You will need to be proactive and manage your own sales pipeline.
What we want to see in the potential Candidate
- Strong verbal and written communication skills.
- Excellent selling and negotiation skills
- Great listening skills and a desire to learn proper consultative selling techniques.
- Sound knowledge of IT Mobility industry
- Self starter
- High-energy and positive attitude
- Ability to quickly learn and apply new information in customer-facing scenarios
- The ability to write reports and proposals
- The capacity to work well on your own or in a team
- The ability to manage your time and plan your day effectively
- Should have prior experience with a B2B product company
- Good experience in B2B SaaS sales would be an added advantage
- MDM domain understanding would be an added advantage.
Scalefusion (formerly known as Mobilock Pro): (Our Flagship Product)
Scalefusion MDM allows organizations to secure & manage endpoints including smartphones, tablets, laptops, rugged devices, mPOS, and digital signage, along with apps and content. It supports the management of Android, iOS, macOS, and Windows 10 devices and ensures streamlined device management operations with InterOps. Fusion of Endpoints at Scale.
Promobi Technologies:
ProMobi Technologies provides a leading Mobile Device Management Solution under the brand Scalefusion. The solution allows organizations to manage Android and iOS devices from the cloud. It offers modern mobile device management (MDM), application management (MAM) and content management (MCM) experience for corporate-owned devices. Renowned organizations from startups to Fortune 500 trust Scalefusion for their Device Management.


