11+ Acquisition Jobs in Hyderabad | Acquisition Job openings in Hyderabad
Dear Candidate,
We are urgently hiring QA Automation Testers for the Hyderabad location.
Position: QA Tester-Automation
Location: Hyderabad
Experience: 5-8 yrs
Salary: Best in industry (20-25% hike on current CTC)
Note:
Only candidates who can join immediately or within 15 days will be preferred.
Only candidates from Tier 1 companies will be shortlisted and selected.
Candidates with a notice period of more than 30 days will be rejected during screening.
Offer shoppers will be rejected.
JD:
Role: Automation QA Engineer
Requirements:
- Test automation experience
- Strong knowledge of OOP (understanding OOP in automation)
- Excellent knowledge of automation testing principles
- Excellent knowledge of web-based application structure
- Expert in API testing
- Experience in UI testing
- Wide experience in developing automation testing frameworks
- Hands-on experience with Cypress and Playwright
- CI knowledge and experience (Azure DevOps)
- Good theoretical knowledge of testing levels and types
- Test design techniques knowledge
- Deep understanding of SDLC (SCRUM, Kanban)
- Knowledge and experience in reviewing requirements
- Problem-solving skills
- Databases general knowledge (SQL)
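As a small illustration of the "test design techniques" requirement above, here is a conceptual sketch of boundary value analysis in Python. The inclusive range and the "age" field are hypothetical examples, not taken from the posting.

```python
# Sketch of a classic test-design technique: boundary value analysis.
# The value range and field name below are hypothetical examples.

def boundary_values(lo, hi):
    """Return the standard boundary-value test inputs for an
    inclusive integer range [lo, hi]."""
    return [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1]

# e.g. a hypothetical "age" field that accepts 18..60 inclusive
cases = boundary_values(18, 60)
valid = [v for v in cases if 18 <= v <= 60]
invalid = [v for v in cases if not (18 <= v <= 60)]
```

The invalid inputs (just outside each boundary) are where off-by-one defects typically surface, which is why this technique is a staple of structured test design.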
Best Regards,
Minakshi Soni
Executive - Talent Acquisition (L2)
Rigel Networks
Worldwide Locations: USA | HK | IN
Job Title: Data Engineer
Location: Hyderabad
About us:
Blurgs AI is a deep-tech startup focused on maritime and defence data-intelligence solutions, specialising in multi-modal sensor fusion and data correlation. Our flagship product, Trident, provides advanced domain awareness for maritime, defence, and commercial sectors by integrating data from various sensors like AIS, Radar, SAR, and EO/IR.
At Blurgs AI, we foster a collaborative, innovative, and growth-driven culture. Our team is passionate about solving real-world challenges, and we prioritise an open, inclusive work environment where creativity and problem-solving thrive. We encourage new hires to bring their ideas to the table, offering opportunities for personal growth, skill development, and the chance to work on cutting-edge technology that impacts global defence and maritime operations.
Join us to be part of a team that's shaping the future of technology in a fast-paced, dynamic industry.
Job Summary:
We are looking for a Senior Data Engineer to design, build, and maintain a robust, scalable on-premise data infrastructure. You will focus on real-time and batch data processing using platforms such as Apache Pulsar and Apache Flink, work with NoSQL databases like MongoDB and ClickHouse, and deploy services using containerization technologies like Docker and Kubernetes. This role is ideal for engineers with strong systems knowledge, deep backend data experience, and a passion for building efficient, low-latency data pipelines in a non-cloud, on-prem environment.
Key Responsibilities:
- Data Pipeline & Streaming Development
- Design and implement real-time data pipelines using Apache Pulsar and Apache Flink to support mission-critical systems.
- Develop high-throughput, low-latency data ingestion and processing workflows across streaming and batch workloads.
- Integrate internal systems and external data sources into a unified on-prem data platform.
- Data Storage & Modelling
- Design efficient data models for MongoDB, ClickHouse, and other on-prem databases to support analytical and operational workloads.
- Optimise storage formats, indexing strategies, and partitioning schemes for performance and scalability.
- Infrastructure & Containerization
- Deploy, manage, and monitor containerised data services using Docker and Kubernetes in on-prem environments.
- Performance, Monitoring & Reliability
- Monitor the performance of streaming jobs and database queries; fine-tune for efficiency and reliability.
- Implement robust logging, metrics, and alerting solutions to ensure data system availability and uptime.
- Identify bottlenecks in the pipeline and proactively implement optimisations.
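The streaming responsibilities above center on windowed aggregation, the core pattern behind Flink-style pipelines. Below is a minimal pure-Python sketch of a tumbling-window count; the event timestamps, keys, and 60-second window size are illustrative assumptions, not details from the role.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_size=60):
    """Group (timestamp, key) events into fixed, non-overlapping
    windows and count occurrences per (window_start, key).

    This mimics, in miniature, what a tumbling-window aggregation
    does in a stream processor such as Apache Flink.
    """
    counts = defaultdict(int)
    for ts, key in events:
        # Every event maps to exactly one window: [window_start, window_start + window_size)
        window_start = (ts // window_size) * window_size
        counts[(window_start, key)] += 1
    return dict(counts)

# Hypothetical sensor events: (timestamp_seconds, source)
events = [(5, "ais"), (30, "radar"), (65, "ais"), (70, "ais")]
result = tumbling_window_counts(events)
```

In a real Flink job the same logic would be expressed with a keyed stream and a tumbling event-time window, with the framework handling state, watermarks, and parallelism.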
Required Skills & Experience:
- Strong experience in data engineering with a focus on on-premise infrastructure.
- Strong expertise in streaming technologies like Apache Pulsar, Apache Flink, or similar.
- Deep experience with MongoDB, ClickHouse, and other NoSQL or columnar storage databases.
- Proficient in Python, Java, or Scala for data processing and backend development.
- Hands-on experience deploying and managing systems using Docker and Kubernetes.
- Familiarity with Linux-based systems, system tuning, and resource monitoring.
Preferred Qualifications:
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field, or an equivalent combination of education and experience.
Additional Responsibilities for Senior Data Engineers:
For those hired as Senior Data Engineers, the role will come with added responsibilities, including:
- Leadership & Mentorship: Guide and mentor junior engineers, sharing expertise and best practices.
- System Architecture: Lead the design and optimization of complex real-time and batch data pipelines, ensuring scalability and performance.
- Sensor Data Expertise: Focus on building and optimizing sensor-based data pipelines and stateful stream processing for mission-critical applications in domains like maritime and defense.
- End-to-End Ownership: Take responsibility for the performance, reliability, and optimization of data systems.
Compensation:
- Data Engineer CTC: 4 - 8 LPA
- Senior Data Engineer CTC: 12 - 16 LPA
About OOLIO:
Founded in 2019, Oolio has rapidly grown into Australia’s largest hospitality tech provider, trusted by industry giants like Star Casinos. Our innovative solutions empower businesses to streamline operations, enhance guest experiences, and drive revenue growth. With a global footprint, we have established offices in the UK, US, and New Zealand, delivering cutting-edge technology to some of the world's most dynamic hospitality brands. At Oolio, we’re not just building software—we’re shaping the future of hospitality with innovation, agility, and a passion for excellence.
What You’ll Do:
- Lead AI-driven workflow enablement (Cursor, Devin, Windsurf, etc.)
- Own developer velocity — from CI fast-tracking to dev environment tooling (Nix)
- Optimize build and test cycles for our React codebases
- Define and enforce high standards for unit and E2E testing
- Drive open API design and alignment across internal and external teams
- Write sharp RFCs, lead cross-functional architecture discussions, and ship high-leverage tooling
- Make pragmatic, cost-conscious engineering decisions that balance velocity and scale
What You Bring:
- 6+ years of software development experience – primarily with product-based (SaaS) companies
- 5-7 years of deep experience with modern React front end – building highly scalable features from scratch
- Experience working in a fast-paced environment and startups is a plus
- Strong command of CI/CD, dev environment tooling, and reproducible builds
- Production-level Kubernetes experience
- Hands-on with AI development tools (Cursor, Devin, etc.)
- Strong belief in open API specs and clean contracts between systems
- Demonstrated open-source contributions
- Systems thinker who writes, documents, and communicates with clarity
Key Benefits:
- Employee Health Insurance Plan up to ₹3L
- Annual Paid Holidays
- Annual Team Meetups
- Performance Bonus
- Flexible Work Environment
Job Description – Senior .NET Developer (Angular/React)
Job Title: Senior .NET Developer (Angular/React)
Location: Hyderabad (Hybrid)
Job Summary: We are seeking a Senior .NET Developer with strong frontend expertise in Angular or React to join our dynamic team. The ideal candidate will have deep experience in .NET Core, C#, Web API, relational/non-relational databases, and cloud platforms (Azure/AWS), while also being skilled in modern frontend development. This role requires designing and developing scalable, high-performance, and secure applications with a strong focus on frontend and backend integration.
Key Responsibilities:
Develop, test, and maintain .NET Core applications and Web APIs.
Build dynamic and responsive UI components using Angular or React.
Ensure seamless frontend-backend integration with RESTful APIs.
Design and implement scalable, secure, and high-performance applications.
Work with relational (SQL Server, PostgreSQL, MySQL) and non-relational (MongoDB, DynamoDB) databases for efficient data management.
Apply OOP principles, SOLID design, and software development best practices.
Utilize Azure/AWS cloud services for deployment and management.
Ensure code quality through unit testing, debugging, and performance optimization.
Implement and manage Git for version control.
Collaborate with cross-functional teams to define, design, and deliver new features.
Troubleshoot and resolve technical issues to maintain application stability.
Required Qualifications & Skills:
4-8 years of hands-on experience in .NET development.
Strong proficiency in .NET Core, C#, Web API.
Must have experience with frontend technologies—Angular or React.
Solid understanding of RESTful services, microservices architecture, and SOA.
Experience with relational and non-relational databases (SQL Server, PostgreSQL, MySQL, MongoDB, DynamoDB).
Hands-on experience with cloud platforms (Azure/AWS) and DevOps practices.
Strong debugging, problem-solving, and troubleshooting skills.
Familiarity with CI/CD pipelines, Docker/Kubernetes is a plus.
Bachelor's degree in Computer Science, Software Engineering, or a related field.
Who we are looking for
A candidate who can design, build and configure applications to meet business process and application requirements.
Key Role Requirements:
- Experience in Java-J2EE Development
- Hands-on experience in Struts (Core Java, JSP, Servlets, EJB) framework.
- Hands-on experience in Oracle SQL, Procedures
- Hands-on experience in Spring boot, Spring batch
- Should have worked on application servers like IBM WAS / JBoss / WebLogic.
- Knowledge of basic Linux commands and the BIRT reporting tool is an added advantage.
- Knowledge of SVN or any other version control system is an added advantage.
- Knowledge of build tools like Ant and Maven is an added advantage.
- Strong problem solving and analytical capabilities.
Where: Hyderabad/ Bengaluru, India (Hybrid Mode 3 Days/Week in Office)
Job Description:
- Collaborate with stakeholders to develop a data strategy that meets enterprise needs and industry requirements.
- Create an inventory of the data necessary to build and implement a data architecture.
- Envision data pipelines and how data will flow through the data landscape.
- Evaluate current data management technologies and what additional tools are needed.
- Determine upgrades and improvements to current data architectures.
- Design, document, build and implement database architectures and applications; hands-on experience in building high-scale OLAP systems is required.
- Build data models for database structures, analytics, and use cases.
- Develop and enforce database development standards with solid DB/ Query optimizations capabilities.
- Integrate new systems and functions like security, performance, scalability, governance, reliability, and data recovery.
- Research new opportunities and create methods to acquire data.
- Develop measures that ensure data accuracy, integrity, and accessibility.
- Continually monitor, refine, and report data management system performance.
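The "DB/Query optimization" responsibility above boils down to work like the following: add an index for a frequent predicate and confirm the planner actually uses it. This is a minimal sketch using Python's built-in sqlite3; the `orders` table and column names are hypothetical.

```python
import sqlite3

# Minimal sketch of query optimization: add an index for a frequent
# filter column and verify the planner switches from a scan to an
# index search. Table and column names are hypothetical.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INT, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 100, i * 1.5) for i in range(1000)])

# Without an index, filtering on customer_id forces a full table scan.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42").fetchall()

conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

# With the index in place, the plan uses an index search instead.
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42").fetchall()
```

The same before/after discipline (inspect the plan, change the schema, re-inspect) carries over to Snowflake, BigQuery, or any other engine listed in the requirements, though the plan syntax differs.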
Required Qualifications and Skillset:
- Extensive knowledge of Azure, GCP clouds, and DataOps Data Eco-System (super strong in one of the two clouds and satisfactory in the other one)
- Hands-on expertise in systems like Snowflake, Synapse, SQL DW, BigQuery, and Cosmos DB. (Expertise in any 3 is a must)
- Azure Data Factory, Dataiku, Fivetran, Google Cloud Dataflow (Any 2)
- Hands-on experience in working with services/technologies like - Apache Airflow, Cloud Composer, Oozie, Azure Data Factory, and Cloud Data Fusion (Expertise in any 2 is required)
- Well-versed with Data services, integration, ingestion, ELT/ETL, Data Governance, Security, and Meta-driven Development.
- Expertise in RDBMS (relational database management system) – writing complex SQL logic, DB/Query optimization, Data Modelling, and managing high data volume for mission-critical applications.
- Strong grip on programming using Python and PySpark.
- Clear understanding of data best practices prevailing in the industry.
- Preference to candidates having Azure or GCP architect certification. (Either of the two would suffice)
- Strong networking and data security experience.
Awareness of the Following:
- Application development understanding (Full Stack)
- Experience on open-source tools like Kafka, Spark, Splunk, Superset, etc.
- Good understanding of Analytics Platform Landscape that includes AI/ML
- Experience in any Data Visualization tool like PowerBI / Tableau / Qlik /QuickSight etc.
About Us
Gramener is a design-led data science company. We build custom Data & AI solutions that help solve complex business problems with actionable insights and compelling data stories. We partner with enterprise data and digital transformation teams to improve the data-driven decision-making culture across the organization. Our open standard low-code platform, Gramex, rapidly builds engaging Data & AI solutions across multiple business verticals and use cases. Our solutions and technology have been recognized by analysts such as Gartner and Forrester and have won several awards.
We Offer You:
- a chance to try new things & take risks.
- meaningful problems you'll be proud to solve.
- people you will be comfortable working with.
- transparent and innovative work environment.
To know more about us visit Gramener Website and Gramener Blog.
If you are interested, kindly share the details below:
Total Experience:
Relevant Experience:
Notice Period:
CTC:
ECTC:
Current Location:
- End-to-end full-stack development experience
- Full hands-on experience with at least one of the following languages: Java, PHP, Python, .NET, and code repositories like Git, SVN
- Expertise with HTML5, jQuery, CSS
- Proficiency with front-end JavaScript frameworks like Angular, React, etc.
- Experience of designing and developing APIs in a micro-service architecture
- Experience with webserver technologies like Node.js, J2EE, Apache etc.
- Good understanding and working experience with either relational or non-relational databases like Oracle, MySQL, PostgreSQL, MongoDB, Cassandra
- Good understanding of the changing trends of user interface guidelines for mobile apps and be able to transform apps to newer form factors
- Hands-on with Native UI Components
- Familiar with Cloud components & deployment procedures.
- Familiar with code repository
Role: SW Project Manager
Location: Hyderabad
Fulltime
Experience: Candidates should have an automotive software background.
Must Have:
- Experience of working in any of the commercial AUTOSAR stack (EB/Vector/KPIT/Mentor/Arccore etc.)
- Very good problem resolution and debugging skills.
- Very good communication and cross-cultural skills.
- Experience in project planning, estimation, budget control, release management
- Very good knowledge in ASPICE
Good To Have:
- Experience in migrating base software from legacy to AUTOSAR architecture
- Experience in configuration of different AUTOSAR BSW components (Comms/DCM/DEM/Mem/OS/RTE)
- Experience of working in Vector AUTOSAR stack.
- Experience in Infineon multicore processors - 2xx series.
Job Roles and Responsibilities:
- Manage a team of engineers in the role of local software project manager
- Responsible for software delivery out of India Site.
- Follow ASPICE process (develop SRS, SDD, SW Construction, Software Unit Verification) before delivering software to OEM
- Travel to various ZF locations across the globe based on need.
Role: Functions Developer (Embedded C - Algorithm / Driving Functions Development)
Location: Hyderabad
Fulltime
Job Description:
- Design and development of automotive feature/function software/components (ACC, AEB, TSR, LKA etc.) for ADAS/AD systems
- Coordination and regular interaction with different stakeholders and teams like testing, requirements, leads etc.
- Participate in SW requirement generation, SW architecture, detailed design etc.
Requirement:
- 3-7 years of experience in development of algorithms and functions for advanced driver-assistance systems (ADAS) and autonomous driving (AD)
- Development experience with safety critical systems
- Experienced in development using MATLAB Simulink, TargetLink, Stateflow
- Experience in modelling and validation of control systems
- Knowledge of SIL, Performance Test, Functional testing
- Embedded software development using C, C++
- Issue management and version control
- Knowledge of ASPICE processes, Static analysis, MISRA checks etc.
- Strong written and verbal communication skills
- Proactive approach for problem solving
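The control-systems requirement above (modelling and validating functions like ACC) usually starts from a feedback controller. The following is a conceptual discrete PID sketch in plain Python; in practice this logic would live in MATLAB Simulink/TargetLink models, and the gains and errors below are illustrative assumptions only.

```python
# Conceptual sketch of a discrete PID controller, the kind of feedback
# logic behind driving functions such as ACC. Gains, time step, and
# error values are illustrative assumptions, not real calibration data.

def make_pid(kp, ki, kd, dt):
    """Return a stateful PID step function mapping error -> control output."""
    state = {"integral": 0.0, "prev_error": None}

    def step(error):
        # Accumulate the integral term and approximate the derivative
        # with a backward difference (zero on the first sample).
        state["integral"] += error * dt
        derivative = 0.0 if state["prev_error"] is None else (error - state["prev_error"]) / dt
        state["prev_error"] = error
        return kp * error + ki * state["integral"] + kd * derivative

    return step

pid = make_pid(kp=0.5, ki=0.1, kd=0.05, dt=0.1)
# As a shrinking speed error is fed in, the control output decreases.
outputs = [pid(e) for e in (10.0, 6.0, 3.0, 1.0)]
```

Validating such a controller against SIL and functional tests, as the requirements describe, means checking exactly this kind of input/output behavior across calibrated scenarios.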
Good to have:
- Knowledge of ADAS/AD functions (ACC, TSR, AEB, LCA etc), Data Analysis
- Experienced in managing and authoring of function specification requirements
- Familiarity with AUTOSAR RTE
Nice to have:
- AUTOSAR, Functional Safety (ISO26262) exposure
- Scripting Knowledge - Python, MATLAB
- Working knowledge of automotive protocols like CAN, Ethernet etc.
Thanks,
Satish
2. Daily report to the reporting manager
3. Should be able to work in agile methodology
4. Should be comfortable with State Management in Flutter
5. Should be well versed with programming concepts