11+ Requirements management Jobs in Bangalore (Bengaluru) | Requirements management Job openings in Bangalore (Bengaluru)
Job Description: Business Analyst
(Job Code: BNSAN1)
No of Positions: 2
Purpose of Position:
Requirements management is one of the core skills of business analysts. Developing technical solutions to business problems, or advancing a company’s sales efforts, begins with defining, analyzing and documenting requirements. Managing requirements at the project level helps fulfill business needs. Skilled business analysts also use requirements to drive the design or review of test cases, process change requests, and manage a project’s scope, acceptance, installation and deployment.
Accountability:
The Business Analyst is directly accountable to the VP of Operations. The Business Analyst will undergo a yearly performance appraisal.
Duties and Responsibilities:
- Lead Role:
- Assisting with the business case
- Planning and monitoring
- Eliciting requirements
- Requirements organization
- Translating and simplifying requirements
- Requirements management and communication
- Requirements analysis
- Responsibilities:
- Implementing advanced strategies for gathering, reviewing and analyzing data requirements
- Prioritising requirements and creating conceptual prototypes and mock-ups
- Mastering strategic business process modelling, traceability and quality management techniques
- Applying best practices for effective communication and problem-solving
Qualifications and Skills:
Becoming a successful business analyst takes core business skills and specialized knowledge that advance a firm’s objectives and help keep it competitive in a complex economy. Business analysts are typically required to assess and validate their activities and to determine whether a solution has fulfilled the requirements and achieved the business benefits in the areas of workflow and customer relationship management.
- Written and verbal communication, including technical writing skills
- Understanding of systems engineering concepts
- The ability to conduct cost/benefit analysis
- Business case development
- Modeling techniques and methods
- Leadership Skills
- Excellent math and organizational skills
- Professional, courteous and positive manner
- Ability to set priorities and manage multiple task functions simultaneously
Immediate Joiners Preferred
NOTE: Working Shift: 11 am - 8 pm IST (+/- one hour on need basis) Monday to Friday
Responsibilities :
1. Leverage Alteryx to build, maintain, execute and deliver data assets. This entails using data analytics tools (a combination of proprietary tools, Excel, and scripting), validating raw data quality, and working with Tableau/Power BI and other geospatial data visualization applications
2. Analyze large data sets in Alteryx, finding patterns and providing a concise summary
3. Implement, maintain, and troubleshoot analytics solutions and derive key insights based on Tableau/Power BI visualization and data analysis
4. Improve ETL setup and develop components so they can be rapidly deployed and configured
5. Conduct data assessment, perform data quality checks and transform data using SQL and ETL tools
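The data quality checks mentioned above usually reduce to a handful of SQL probes for completeness, uniqueness and validity. A minimal sketch, using Python's built-in sqlite3 as a stand-in for the actual SQL/ETL environment; the table and column names are hypothetical:

```python
import sqlite3

# Toy data set standing in for a raw ETL extract (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (order_id INTEGER, customer TEXT, amount REAL);
    INSERT INTO raw_orders VALUES
        (1, 'acme', 120.0),
        (2, NULL,  35.5),
        (2, 'brio', 35.5),
        (3, 'acme', -10.0);
""")

# Completeness: rows with a missing customer.
null_customers = conn.execute(
    "SELECT COUNT(*) FROM raw_orders WHERE customer IS NULL").fetchone()[0]

# Uniqueness: order_ids that appear more than once.
dup_ids = conn.execute("""
    SELECT COUNT(*) FROM (
        SELECT order_id FROM raw_orders
        GROUP BY order_id HAVING COUNT(*) > 1)
""").fetchone()[0]

# Validity: amounts that violate a business rule (no negatives).
bad_amounts = conn.execute(
    "SELECT COUNT(*) FROM raw_orders WHERE amount < 0").fetchone()[0]

print(null_customers, dup_ids, bad_amounts)  # → 1 1 1
```

In a real pipeline the same queries would run against the warehouse (SQL Server, etc.) and feed a reject/quarantine step rather than a print.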
Qualification :
1. 3+ years of relevant and professional work experience with a reputed analytics firm
2. Bachelor's degree in Engineering / Information Technology from a reputed college
3. Must have the knowledge to handle/design/optimize complex ETL using Alteryx
4. Expertise with visualization tools such as Tableau/Power BI to solve business problems related to data exploration and visualization
5. Good knowledge in handling large amounts of data through SQL, T-SQL or PL-SQL
6. Basic knowledge in Python for data processing will be good to have
7. Very good understanding of data warehousing and data lake concepts
- Big Data developer with 8+ years of professional IT experience and expertise in Hadoop ecosystem components for ingestion, data modeling, querying, processing, storage, analysis and data integration, and in implementing enterprise-level Big Data systems.
- A skilled developer with strong problem solving, debugging and analytical capabilities, who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components such as Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, ZooKeeper, YARN, Flume, Pig, NiFi, Scala and Oozie.
- Hands-on experience in creating real-time data streaming solutions using Apache Spark core, Spark SQL & DataFrames, Kafka, Spark Streaming and Apache Storm.
- Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, including NameNode, DataNode, ResourceManager, NodeManager and JobHistory Server.
- Worked on both Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in the installation, configuration and management of Big Data and the underlying infrastructure of Hadoop clusters.
- Hands on experience in coding MapReduce/Yarn Programs using Java, Scala and Python for analyzing Big Data.
- Exposure to Cloudera development environment and management using Cloudera Manager.
- Extensively worked on Spark using Scala on clusters for analytics: installed it on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL/Oracle.
- Implemented Spark using Python, utilizing DataFrames and the Spark SQL API for faster processing; handled importing data from different data sources into HDFS using Sqoop, performing transformations using Hive and MapReduce, and then loading the data into HDFS.
- Used the Spark DataFrames API over the Cloudera platform to perform analytics on Hive data.
- Hands-on experience with Spark MLlib, used for predictive intelligence, customer segmentation and smooth maintenance in Spark Streaming.
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Created data pipelines for the ingestion, aggregation and loading of consumer response data into Hive external tables in HDFS to serve as a feed for Tableau dashboards.
- Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
- In-depth Understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
- Hands on expertise in real time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
- Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
- Experience with the Microsoft cloud and with setting up clusters on Amazon EC2 & S3, including automation of setting up and extending clusters in the AWS cloud.
- Extensively worked on Spark using Python on clusters for analytics: installed it on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL.
- Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
- Knowledge in installation, configuration, supporting and managing Hadoop Clusters using Apache, Cloudera (CDH3, CDH4) distributions and on Amazon web services (AWS).
- Experienced in writing ad hoc queries using Cloudera Impala, including Impala analytical functions.
- Experience in creating DataFrames using PySpark and performing operations on them using Python.
- In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS and MapReduce Programming Paradigm, High Availability and YARN architecture.
- Established multiple connections to different Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provided access for pulling the information needed for analysis.
- Generated various kinds of knowledge reports using Power BI based on Business specification.
- Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
- Well experienced in projects using JIRA, testing, Maven and Jenkins build tools.
- Experienced in designing, building, deploying and utilizing almost all of the AWS stack (including EC2 and S3), focusing on high availability, fault tolerance and auto-scaling.
- Good experience with use-case development, with Software methodologies like Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
- Good working experience in importing data using Sqoop and SFTP from various sources such as RDBMS, Teradata, Mainframes, Oracle and Netezza into HDFS, and performing transformations on it using Hive, Pig and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases including HBase, Cassandra and MongoDB, and their integration with Hadoop clusters.
- Hands on experience in Hadoop Big data technology working on MapReduce, Pig, Hive as Analysis tool, Sqoop and Flume data import/export tools.
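The MapReduce programming paradigm referenced throughout the profile can be illustrated without a cluster. A toy word count in plain Python, showing the map, shuffle and reduce phases; the function names are illustrative, not a real Hadoop API:

```python
from collections import defaultdict

def map_phase(lines):
    """Map: emit (word, 1) pairs, as a Hadoop mapper would."""
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def shuffle_phase(pairs):
    """Shuffle: group values by key, as the framework does between phases."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reduce: sum the counts for each word."""
    return {word: sum(counts) for word, counts in grouped.items()}

lines = ["big data big clusters", "data pipelines"]
counts = reduce_phase(shuffle_phase(map_phase(lines)))
print(counts)  # → {'big': 2, 'data': 2, 'clusters': 1, 'pipelines': 1}
```

On a real cluster the three phases run on different machines and the shuffle moves data over the network; the logic per phase is the same.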
Position Overview: We are seeking a talented and experienced Cloud Engineer specialized in AWS cloud services to join our dynamic team. The ideal candidate will have a strong background in AWS infrastructure and services, including EC2, Elastic Load Balancing (ELB), Auto Scaling, S3, VPC, RDS, CloudFormation, CloudFront, Route 53, AWS Certificate Manager (ACM), and Terraform for Infrastructure as Code (IaC). Experience with other AWS services is a plus.
Responsibilities:
• Design, deploy, and maintain AWS infrastructure solutions, ensuring scalability, reliability, and security.
• Configure and manage EC2 instances to meet application requirements.
• Implement and manage Elastic Load Balancers (ELB) to distribute incoming traffic across multiple instances.
• Set up and manage AWS Auto Scaling to dynamically adjust resources based on demand.
• Configure and maintain VPCs, including subnets, route tables, and security groups, to control network traffic.
• Deploy and manage AWS CloudFormation and Terraform templates to automate infrastructure provisioning using Infrastructure as Code (IaC) principles.
• Implement and monitor S3 storage solutions for secure and scalable data storage.
• Set up and manage CloudFront distributions for content delivery with low latency and high transfer speeds.
• Configure Route 53 for domain management, DNS routing, and failover configurations.
• Manage AWS Certificate Manager (ACM) for provisioning, managing, and deploying SSL/TLS certificates.
• Collaborate with cross-functional teams to understand business requirements and provide effective cloud solutions.
• Stay updated with the latest AWS technologies and best practices to drive continuous improvement.
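As a sketch of the Infrastructure as Code approach listed above, a minimal hypothetical CloudFormation template that provisions a single versioned S3 bucket; the logical ID and description are illustrative, not taken from any real stack:

```yaml
AWSTemplateFormatVersion: "2010-09-09"
Description: Minimal illustrative template - one versioned S3 bucket.
Resources:
  AppAssetsBucket:                 # hypothetical logical ID
    Type: AWS::S3::Bucket
    Properties:
      VersioningConfiguration:
        Status: Enabled
Outputs:
  BucketName:
    Value: !Ref AppAssetsBucket    # exported so other stacks can reference it
```

The same resource in Terraform would be an `aws_s3_bucket` block; either way the point of IaC is that the bucket is reviewed, versioned and reproduced from text rather than clicked together in the console.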
Qualifications:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• Minimum of 2 years of relevant experience in designing, deploying, and managing AWS cloud solutions.
• Strong proficiency in AWS services such as EC2, ELB, Auto Scaling, VPC, S3, RDS, and CloudFormation.
• Experience with other AWS services such as Lambda, ECS, EKS, and DynamoDB is a plus.
• Solid understanding of cloud computing principles, including IaaS, PaaS, and SaaS.
• Excellent problem-solving skills and the ability to troubleshoot complex issues in a cloud environment.
• Strong communication skills with the ability to collaborate effectively with cross-functional teams.
• Relevant AWS certifications (e.g., AWS Certified Solutions Architect, AWS Certified DevOps Engineer, etc.) are highly desirable.
Additional Information:
• We value creativity, innovation, and a proactive approach to problem-solving.
• We offer a collaborative and supportive work environment where your ideas and contributions are valued.
• Opportunities for professional growth and development.
Someshwara Software Pvt Ltd is an equal opportunity employer. We celebrate diversity and are dedicated to creating an inclusive environment for all employees.
FURNISHKA is in search of a Creative Content Specialist to join our innovative marketing team. As the Creative Content Specialist, you will play a pivotal role in crafting visually stunning graphic creatives and video reels to elevate FURNISHKA's online presence across platforms such as the website, Instagram, Facebook, and Google. Additionally, you will contribute to offline marketing campaigns and create captivating visuals for our retail stores.
Responsibilities:
1. Develop visually appealing graphic creatives for online platforms, including the web and social media channels.
2. Create engaging video reels to showcase FURNISHKA's products and brand identity.
3. Design offline marketing collateral, ensuring alignment with the overall brand aesthetic.
4. Collaborate closely with the marketing team to understand campaign objectives and deliver compelling creatives.
5. Ensure brand consistency across all visual content and marketing materials.
6. Contribute creative ideas to enhance FURNISHKA's visual storytelling.
Qualifications:
1. Proven experience in graphic design and video editing, with a strong portfolio showcasing your creative skills.
2. Proficiency in design software such as Adobe Creative Suite (Photoshop, Illustrator, Premiere Pro, etc.).
3. Understanding of social media trends and best practices for creating engaging content.
4. Ability to adapt design styles to suit various marketing channels and mediums.
5. Strong attention to detail and a keen eye for aesthetics.
Job Description:
- Bachelor's or Master's degree in Computer Science.
- Proven experience in leading a team of backend engineers
- Strong organizational and project management skills.
- Proficiency with fundamental front-end languages such as HTML, CSS and JavaScript.
- Familiarity with Android development: Kotlin, RxJava, design patterns.
- Proficiency with server-side frameworks such as Spring Boot and Hibernate.
- Familiarity with database technologies such as MySQL and MongoDB.
- Familiarity with cloud platforms (GCP).
- Experience with Agile/Scrum methodologies
- Excellent verbal communication skills.
- Good problem-solving skills.
- Attention to detail.
Experience: 8-10 years
Notice Period: max 15 days
Must-haves:
1. Knowledge about Database/NoSQL DB hosting fundamentals (RDS multi-AZ, DynamoDB, MongoDB, and such)
2. Knowledge of different storage platforms on AWS (EBS, EFS, FSx) - mounting persistent volumes with Docker Containers
3. In-depth knowledge of security principles on AWS (WAF, DDoS protection, Security Groups, NACLs, IAM groups, and SSO)
4. Knowledge of CI/CD platforms is required (Jenkins, GitHub Actions, etc.) - migration of AWS CodePipeline pipelines to GitHub Actions
5. Knowledge of a wide variety of AWS services (SNS, SES, SQS, Athena, Kinesis, S3, ECS, EKS, etc.) is required
6. Knowledge of an Infrastructure as Code tool is required. We use CloudFormation (Terraform is a plus); ideally, we would like to migrate from CloudFormation to Terraform
7. Setting up CloudWatch alarms and SMS/email/Slack alerts
8. Some knowledge of configuring a monitoring tool such as Prometheus, Dynatrace, etc. (we currently use Datadog and CloudWatch)
9. Experience with any CDN provider's configuration (Cloudflare, Fastly, or CloudFront)
10. Experience with either Python or Go as a scripting language
11. Experience with a Git branching strategy
12. Container hosting knowledge on both Windows and Linux
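The CloudWatch alarms in item 7 boil down to a single `put_metric_alarm` API call. A hedged sketch that only builds the parameter dict, so it runs without AWS credentials; the alarm name, threshold and SNS topic ARN are illustrative, and the actual call (commented out) would need boto3:

```python
def cpu_alarm_params(instance_id, threshold=80.0):
    """Build kwargs for CloudWatch's put_metric_alarm (illustrative values)."""
    return {
        "AlarmName": f"high-cpu-{instance_id}",
        "Namespace": "AWS/EC2",
        "MetricName": "CPUUtilization",
        "Dimensions": [{"Name": "InstanceId", "Value": instance_id}],
        "Statistic": "Average",
        "Period": 300,                      # 5-minute evaluation window
        "EvaluationPeriods": 2,             # must breach twice in a row
        "Threshold": threshold,
        "ComparisonOperator": "GreaterThanThreshold",
        # SNS topic fans out to SMS/email/Slack subscribers (hypothetical ARN)
        "AlarmActions": ["arn:aws:sns:us-east-1:123456789012:ops-alerts"],
    }

params = cpu_alarm_params("i-0abc1234")
print(params["AlarmName"])  # → high-cpu-i-0abc1234
# With boto3: boto3.client("cloudwatch").put_metric_alarm(**params)
```

Routing to Slack is typically done by subscribing a webhook-backed Lambda or an AWS Chatbot channel to that SNS topic, rather than by the alarm itself.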
The below list is Nice to Have:
1. Integration experience with code quality tools (SonarQube, NetSparker, etc.) with CI/CD
2. Kubernetes
3. CDNs other than CloudFront (Cloudflare, Fastly, etc.)
4. Collaboration with multiple teams
5. GitOps
Driving design and innovation in the user-facing application to manage Yulu's fleet, you will be working on the Yulu Mobile Application, which includes Maps, interaction with IoT devices via Bluetooth, and various other features. You will use your expertise in application development to evaluate and select development methods, processes, standard methodologies and tools. An eye for detail, pixel perfection and a willingness to walk the extra mile to deliver a great user experience are essential.
Key Responsibilities
● Design and build mobile applications for Apple's iOS platform.
● Collaborate with the design team to define app features.
● Develop test specs and approaches for the application.
● Investigate and resolve performance issues and inefficiencies.
● Ensure the quality and performance of the application to specifications.
● Identify potential problems and resolve application bottlenecks.
● Fix application bugs before the final release.
● Understand the market and participate in product roadmap discussions.
Key Requirements
● Degree from a top engineering college, or equivalent technical background is preferred
● Agility and ability to adapt quickly to changing requirements, scope and priorities
● 2-4 years of industry experience in iOS Mobile Application design and development, with minimum 2 apps deployed in App Store
● A deep familiarity with Swift. Experience working with iOS frameworks such as Maps, Core Location, Core Bluetooth and Core Animation
● Strong UX/UI design exposure and experience in making apps work intuitively
● Ability to identify issues and improve application performance
● Experience in the usage of instruments to detect memory leaks for performance optimization
● Develop unit and functional test cases
● Familiar with the following – Git repository, Restful API, MVC, MVP, MVVM
● Strong CS fundamentals (with competencies in algorithms and data structures)
● Experience with third-party libraries and APIs.
● Solid understanding of the full mobile development life cycle.
● Highly accountable and takes ownership, with a collaborative attitude, and a lifelong learner
Aerchain is an AI-powered Procurement platform transforming the way purchasing is done in enterprises. We are a 2-year-old funded startup with some of the largest global companies as clients, and we are part of 3 top accelerator programs backed by the Reliance Group, top VCs & supply chain experts. We are on the lookout for technology enthusiasts who would love to contribute to revolutionising the world's procurement industry with Artificial Intelligence!
Responsibilities
1. Define technical blueprints and take complete technical ownership of high-level design, tech stack of the product.
2. Design, build, test and deploy product features that are reliable, secure and optimised for performance across all environments.
3. Work closely with product management team to define and refine feature specifications.
4. Work with cross-functional teams to address technical dependencies.
5. Drive some of the company-wide tech initiatives striving towards continuous technical excellence of our platforms.
6. Mentor upcoming engineers in the organisation / team.
Qualifications
1. 4+ years of experience in developing dynamic, scalable web applications.
2. Demonstrated ability in building / architecting large scale products in the React ecosystem.
3. Ability to work with teams in a collaborative and productive manner.
4. Strong computer science fundamentals in data structures and algorithms.
5. Familiarity with cross-browser compatibility issues and demonstrates good design and UI / UX sensibilities.
6. Excellent technical leadership and mentoring skills.
Our Ideal Candidate
1. Loves writing and owning code and enjoys challenges and problem solving.
2. Self driven and motivated with the desire to work in a fast-paced, results-driven agile environment with varied responsibilities.
3. Takes end-to-end ownership of the product, ideates on product requirements with the team and drives it from design to implementation.
4. Action-oriented and dreams big. More importantly, believes in and has the ability of “Making it Happen!”
Why Aerchain
1. Be part of the engineering leadership team - You’ll have the opportunity to work with cross-functional teams and take key technology decisions to build future-forward products.
2. Build a truly disruptive product - We are building a platform that transforms the way S2P cycles happen in enterprises. Using data at every step, we plan on automating processes from requirement gathering all the way to payments, and customers love us!
3. Be part of the high-growth challenge - In the process of scaling from 1-10, we are sure that there are very exciting and challenging times ahead. The experience of being a part of this growth phase is unique and will fast track your career. You’ll obviously learn a lot!