11+ QA DB Jobs in Pune | QA DB Job openings in Pune
Apply to 11+ QA DB Jobs in Pune on CutShort.io. Explore the latest QA DB Job opportunities across top companies like Google, Amazon & Adobe.
Job title: Azure Architect
Locations: Noida, Pune, Bangalore and Mumbai
Responsibilities:
- Develop and maintain scalable architecture, database designs, and data pipelines, and build out new data source integrations to support continuing increases in data volume and complexity
- Design and develop data lakes and data warehouses using Azure cloud services
- Assist in designing end-to-end data and analytics solution architectures and perform POCs within Azure
- Drive the design, sizing, POC setup, etc. of Azure environments and related services for the relevant use cases and solutions
- Review solution requirements and support architecture design to ensure the selection of appropriate technology, efficient use of resources, and integration of multiple systems and technologies
- Must possess good client-facing experience with the ability to facilitate requirements sessions and lead teams
- Support internal presentations to technical and business teams
- Provide technical guidance, mentoring, code reviews, and design-level technical best practices
Experience Needed:
- 12-15 years of industry experience, including at least 3 years in an architect role and 3 to 4 years of experience designing and building analytics solutions in Azure
- Experience in architecting data ingestion/integration frameworks capable of processing structured, semi-structured & unstructured data sets in batch & real-time
- Hands-on experience in the design of reporting schemas, data marts and development of reporting solutions
- Experience developing batch processing, streaming, and integration solutions that handle structured and unstructured data (a minimal sketch follows at the end of this section)
- Demonstrated experience with ETL development both on-premises and in the cloud using SSIS, Data Factory, Azure Analysis Services, and other ETL technologies
- Experience performing design, development, and deployment using Azure services (Azure Synapse, Data Factory, Azure Data Lake Storage, Databricks, Python, and SSIS)
- Worked with transactional, temporal, time series, and structured and unstructured data.
- Deep understanding of the operational dependencies of applications, networks, systems, security, and policy (both on-premise and in the cloud; VMs, Networking, VPN (Express Route), Active Directory, Storage (Blob, etc.), Windows/Linux).
Mandatory Skills: Azure Synapse, Data Factory, Azure Data Lake Storage, Azure DW, Databricks, Python
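As a flavor of the batch-pipeline work this role involves, here is a minimal PySpark sketch of a Databricks-style job that reads raw JSON from a data lake landing zone and writes a cleansed, partitioned Delta table. The storage paths, column names, and table name are hypothetical placeholders, not anything specified in the posting.

```python
# Minimal PySpark batch-ingestion sketch (hypothetical paths and schema).
# Assumes a Databricks-like environment where Delta Lake is available.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("raw-to-curated-batch").getOrCreate()

# Read raw, semi-structured events from the data lake landing zone.
raw = spark.read.json("abfss://landing@examplelake.dfs.core.windows.net/events/")

# Light cleansing: drop incomplete rows, normalize the timestamp, deduplicate.
curated = (
    raw.dropna(subset=["event_id", "event_time"])
       .withColumn("event_time", F.to_timestamp("event_time"))
       .dropDuplicates(["event_id"])
)

# Write to the curated zone as a Delta table, partitioned by event date
# (assumes the "curated" database already exists).
(curated
    .withColumn("event_date", F.to_date("event_time"))
    .write.format("delta")
    .mode("append")
    .partitionBy("event_date")
    .saveAsTable("curated.events"))
```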
Position: General Cloud Automation Engineer/General Cloud Engineer
Location: Balewadi High Street, Pune
Key Responsibilities:
- Strategic Automation Leadership
- Drive automation to improve deployment speed and reduce manual work.
- Promote scalable, long-term automation solutions.
- Infrastructure as Code (IaC) & Configuration Management
- Develop IaC using Terraform, CloudFormation, Ansible.
- Maintain infrastructure via Ansible, Puppet, Chef.
- Scripting in Python, Bash, PowerShell, JavaScript, GoLang.
- CI/CD & Cloud Optimization
- Enhance CI/CD using Jenkins, GitHub Actions, GitLab CI/CD.
- Automate across AWS, Azure, GCP, focusing on performance, networking, and cost-efficiency.
- Integrate monitoring tools such as Prometheus, Grafana, Datadog, ELK.
- Security Automation
- Enforce security with tools like Vault, Snyk, Prisma Cloud.
- Implement automated compliance and access controls (a minimal sketch follows this list).
- Innovation & Continuous Improvement
- Evaluate and adopt emerging automation tools.
- Foster a forward-thinking automation culture.
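To ground the compliance-automation item above, here is a minimal, hedged Python sketch that uses boto3 to flag EC2 instances missing a required tag. The tag key, region, and policy are illustrative assumptions rather than anything specified in the posting.

```python
# Minimal compliance-check sketch (assumed tag policy and region).
# Flags EC2 instances that are missing a required "owner" tag.
import boto3

REQUIRED_TAG = "owner"  # hypothetical tag key mandated by policy

ec2 = boto3.client("ec2", region_name="us-east-1")  # assumed region

paginator = ec2.get_paginator("describe_instances")
for page in paginator.paginate():
    for reservation in page["Reservations"]:
        for instance in reservation["Instances"]:
            tags = {t["Key"]: t["Value"] for t in instance.get("Tags", [])}
            if REQUIRED_TAG not in tags:
                print(f"Non-compliant instance: {instance['InstanceId']}")
```

In practice a script like this would run on a schedule (or as a Lambda) and feed a ticketing or alerting workflow rather than printing to stdout.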
Required Skills & Tools:
Strong background in automation, DevOps, and cloud engineering.
Expert in:
IaC: Terraform, CloudFormation, Azure ARM, Bicep
Config Mgmt: Ansible, Puppet, Chef
Cloud Platforms: AWS, Azure, GCP
CI/CD: Jenkins, GitHub Actions, GitLab CI/CD
Scripting: Python, Bash, PowerShell, JavaScript, GoLang
Monitoring & Security: Prometheus, Grafana, ELK, Vault, Prisma Cloud
Network Automation: Private Endpoints, Transit Gateways, Firewalls, etc.
Certifications Preferred:
AWS DevOps Engineer
Terraform Associate
Red Hat Certified Engineer
Details:
Role: Sales Development Representative (SDR) – India
Location: Pune (In-office only)
Experience: Minimum 6 months
Type: Full-time
Must have
- Minimum 6 months of experience in sales development, inside sales, or similar outbound sales roles
- Strong resilience and comfort with rejection
- Excellent communication skills – clear, confident, and persuasive
- Coachability and a hunger to learn and iterate
- Strong organizational skills with the ability to manage outreach pipelines independently
- Willingness to work in-office from Pune
Good to have
- Experience in B2B SaaS or agency sales
- Familiarity with the Indian SMB/startup ecosystem
- Familiarity with consultative selling techniques
- Knowledge of sales automation tools
- Exposure to account-based selling
- Prior experience in content marketing or digital services is a huge plus
What You'll Be Doing
- Researching accounts & decision-makers in India’s B2B space
- Running outbound campaigns across calls, emails, and LinkedIn
- Owning outreach sequences to book qualified discovery calls
- Building and nurturing relationships with potential B2B clients
- Updating CRM with activities and ensuring follow-ups
- Identifying upsell opportunities and setting the stage for long-term account growth
- Meeting monthly quotas for qualified meetings booked
Skills required for the role
- Outbound sales
- Lead generation
- Cold calling
- Cold emailing
- LinkedIn outreach
- Comfortable with CRM tools
- Communication (verbal & written)
- Prospecting & qualification
- Sales Pipeline management
- Sales cadence management
Note: This is an in-office role in Pune, India.
About Us
Wittypen is a managed marketplace for content. We work with some of the best brands, like Freshworks, Swiggy, Acko, and Paytm, to create content through our pool of 1,700+ freelance writers.
Founded in 2015, Wittypen is today one of the most credible content platforms, working with customers across 5+ countries and creating thousands of content pieces every month.
We believe in having a goal-driven culture where our colleagues try to do the best work of their lives in a way that also drives meaning and impact.
Benefits
- 5-day work week.
- Choose any 8 hours between 10am-8pm.
- Play a crucial role in shaping the brand of Wittypen.
- Opportunities to develop your skills and grow your career in a company that values content marketing.
- Be part of a creative team that encourages innovation and collaboration.
- Make a significant impact on the brand's visibility and growth.
- Join a workplace that values diversity and provides a supportive environment.
Preferred Education & Experience:
- Bachelor's or master's degree in Computer Engineering, Computer Science, Computer Applications, Mathematics, Statistics, or a related technical field, or equivalent practical experience; at least 3 years of relevant experience in lieu of the above if from a different stream of education.
- Well-versed in, with 5+ years of hands-on, demonstrable experience in:
  ▪ Object-Oriented Modeling, Design, & Programming
  ▪ Microservices Architecture, API Design, & Implementation
  ▪ Relational, Document, & Graph Data Modeling, Design, & Implementation
- Well-versed in, with hands-on, demonstrable experience in:
  ▪ Stream & Batch Big Data Pipeline Processing
  ▪ Distributed Cloud Native Computing
  ▪ Serverless Computing & Cloud Functions
- 5+ years of hands-on development experience in Java programming.
- 3+ years of hands-on development experience in one or more libraries & frameworks such as Spring Boot, Apache Camel, Akka, etc.; extra points if you can demonstrate your knowledge with working examples.
- 2+ years of hands-on development experience in one or more relational and NoSQL datastores such as Amazon S3, Amazon DocumentDB, Amazon Elasticsearch Service, Amazon Aurora, AWS DynamoDB, Amazon Athena, etc.
- 2+ years of hands-on development experience in one or more technologies such as Amazon Simple Queue Service, Amazon Kinesis, Apache Kafka, AWS Lambda, AWS Batch, AWS Glue, AWS Step Functions, Amazon API Gateway, etc.
- 2+ years of hands-on development experience in one or more technologies such as AWS Developer Tools, AWS Management & Governance, AWS Networking and Content Delivery, and AWS Security, Identity, and Compliance.
- Well-versed in virtualization & containerization; must demonstrate experience in technologies such as Kubernetes, Istio, Docker, OpenShift, Anthos, Oracle VirtualBox, Vagrant, etc.
- Well-versed in, with demonstrable working experience in, API Management, API Gateway, Service Mesh, Identity & Access Management, and Data Protection & Encryption.
- Hands-on, demonstrable working experience with DevOps tools and platforms, viz. Jira, Git, Jenkins, code quality & security plugins, Maven, Artifactory, Terraform, Ansible/Chef/Puppet, Spinnaker, etc.
- Well-versed in storage, networking, and storage-networking basics, which will enable you to work in a cloud environment.
- Experience: 5+ years
- Job Location: Remote/Pune
Job Summary: We are seeking a talented and motivated Software Developer with 1-4 years of experience to join our team in Pune. The ideal candidate will be proficient in WPF (Windows Presentation Foundation) and C#, with a strong understanding of modern software development principles and architectural patterns. You will be responsible for designing, developing, and maintaining robust and scalable desktop applications, contributing to the full software development lifecycle within an agile team.
Key Responsibilities:
- Design, develop, test, and deploy high-quality desktop applications using WPF, XAML, and C#.
- Implement and maintain application logic following established architectural patterns such as MVVM (Model-View-ViewModel) and MVC (Model-View-Controller).
- Utilize threading concepts effectively to ensure responsive and performant user interfaces.
- Work with various database technologies to store and retrieve application data efficiently.
- Integrate with internal and external APIs to extend application functionality.
- Apply strong Object-Oriented Programming (OOP) principles in all development activities.
- Collaborate closely with product owners, UI/UX designers, and other developers to translate requirements into technical specifications and deliver effective solutions.
- Participate actively in all phases of the Software Development Life Cycle (SDLC), including requirements gathering, design, development, testing, deployment, and support.
- Adhere to Agile methodologies (Scrum/Kanban) to ensure timely delivery and continuous improvement.
- Contribute to code reviews, ensuring code quality, maintainability, and adherence to coding standards.
- Troubleshoot and debug issues, providing timely resolutions and maintaining application stability.
- Stay updated with the latest industry trends and technologies related to WPF and desktop application development.
Required Technical Skill Set:
- Must-Have Experience:
- WPF (Windows Presentation Foundation) and XAML: Demonstrated expertise in building complex and user-friendly desktop applications.
- C#: Strong proficiency in the C# programming language, including .NET Framework or .NET Core.
- Threading Concepts: Solid understanding and practical experience with multi-threading and asynchronous programming to create responsive applications.
- Database: Experience with relational databases (e.g., SQL Server, MySQL, PostgreSQL) and ORM frameworks (e.g., Entity Framework).
- API Integration: Experience consuming and integrating with RESTful APIs.
- MVVM (Model-View-ViewModel): In-depth understanding and practical application of the MVVM architectural pattern.
- MVC (Model-View-Controller): Familiarity with the MVC architectural pattern.
- Object-Oriented Programming (OOP): Excellent grasp of OOP principles (Encapsulation, Inheritance, Polymorphism, Abstraction) and design patterns.
- Good to Have Experience:
- Windows Canvas / User Document: Experience with advanced UI elements and document handling in WPF.
- Web Programming (ASP.NET): Basic understanding or experience with ASP.NET for potential full-stack awareness.
- Task Management: Experience with task management tools (e.g., JIRA, Azure DevOps).
- Fast-paced Team Environment: Proven ability to thrive and deliver in a dynamic and fast-paced team setting.
- Agile Methodology / SDLC: Practical experience working in an Agile/Scrum environment and a strong understanding of the Software Development Life Cycle.
Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 1 to 4 years of hands-on experience in WPF desktop application development.
Desired Competencies (Technical/Behavioral Competency)
- Strong knowledge of Splunk architecture, components, and deployment models (standalone, distributed, or clustered)
- Hands-on experience with Splunk forwarders, search processing, and index clustering
- Proficiency in writing SPL (Search Processing Language) queries and creating dashboards
- Familiarity with Linux/Unix systems and basic scripting (e.g., Bash, Python)
- Understanding of networking concepts and protocols (TCP/IP, syslog)
Key Responsibilities
- Deploy Splunk Enterprise or Splunk Cloud on servers or virtual environments.
- Configure indexing and search head clusters for data collection and search functionalities.
- Deploy universal or heavy forwarders to collect data from various sources and send it to the Splunk environment
- Configure data inputs (e.g., syslog, SNMP, file monitoring) and outputs (e.g., storage, dashboards)
- Identify and onboard data sources such as logs, metrics, and events.
- Use regular expressions or predefined methods to extract fields from raw data
- Configure props.conf and transforms.conf for data parsing and enrichment.
- Create and manage indexes to organize and control data storage.
- Configure roles and users with appropriate permissions using role-based access control (RBAC).
- Integrate Splunk with external authentication systems like LDAP, SAML, or Active Directory
- Monitor user activities and changes to the Splunk environment
- Optimize Splunk for better search performance and resource utilization
- Regularly monitor the status of indexers, search heads, and forwarders
- Configure backups for configurations and indexed data
- Diagnose and resolve issues like data ingestion failures, search slowness, or system errors.
- Install and manage apps and add-ons from Splunkbase or custom-built solutions.
- Create Python scripts for automation and advanced data processing.
- Integrate Splunk with ITSM tools (e.g., ServiceNow), monitoring tools, or CI/CD pipelines.
- Use Splunk's REST API for automation and custom integrations (a minimal sketch follows at the end of this section)
- Good to have Splunk Core Certified Admin certification
Splunk Development and Administration
- Build and optimize complex SPL (Search Processing Language) queries for dashboards, reports, and alerts.
- Develop and manage Splunk apps and add-ons, including custom Python scripts for data ingestion and enrichment.
- Onboard and validate data sources in Splunk, ensuring proper parsing, indexing, and field extractions.
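To illustrate the REST-API automation mentioned above, here is a minimal, hedged Python sketch that submits a blocking "oneshot" search to Splunk's REST API and prints the JSON results. The host, credentials, and query are placeholders; a real deployment would use a token and verified TLS.

```python
# Minimal Splunk REST API sketch (placeholder host, credentials, and query).
# Submits a blocking "oneshot" search and prints the raw JSON results.
import requests

SPLUNK_HOST = "https://splunk.example.com:8089"  # hypothetical management port
AUTH = ("admin", "changeme")                     # placeholder credentials

response = requests.post(
    f"{SPLUNK_HOST}/services/search/jobs",
    auth=AUTH,
    verify=False,  # self-signed certs are common in labs; verify in production
    data={
        "search": "search index=_internal | head 5",
        "exec_mode": "oneshot",   # run synchronously and return results inline
        "output_mode": "json",
    },
)
response.raise_for_status()
print(response.json())
```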
Egen is a data engineering and cloud modernization firm helping industry-leading companies achieve digital breakthroughs and deliver for the future, today. We are catalysts for change who create digital breakthroughs at warp speed. Our team of cloud and data engineering experts are trusted by top clients in pursuit of the extraordinary. An Inc. 5000 Fastest Growing Company 7 times, and recently recognized on the Crain’s Chicago Business Fast 50 list, Egen has also been recognized as a great place to work 3 times.
You will join a team of insatiably curious data engineers, software architects, and product experts who never settle for "good enough". Our Java Platform team's tech stack is based on Java 8 (Spring Boot) and RESTful web services. We typically build and deploy applications as cloud-native Kubernetes microservices and integrate with scalable technologies such as Kafka in Docker container environments. Our developers work in an agile process to efficiently deliver high-value, data-driven applications and product packages.
Required Experience:
- Minimum of Bachelor’s Degree or its equivalent in Computer Science, Computer Information Systems, Information Technology and Management, Electrical Engineering or a related field.
- Working experience with, and a strong understanding of, object-oriented programming and cloud technologies
- End-to-end experience delivering production-ready code with Java 8, Spring Boot, Spring Data, and API libraries
- Strong experience with unit and integration testing of Spring Boot APIs
- Strong understanding and production experience of RESTful APIs and microservice architecture
- Strong understanding of SQL and NoSQL databases, and experience writing abstraction layers to communicate with them
Nice to haves (but not required):
- Exposure to Kotlin or other JVM programming languages
- Strong understanding and production experience working with Docker container environments
- Strong understanding and production experience working with Kafka
- Cloud Environments: AWS, GCP or Azure
Ask any CIO about corporate data and they’ll happily share all the work they’ve done to make their databases secure and compliant. Ask them about other sensitive information, like contracts, financial documents, and source code, and you’ll probably get a much less confident response. Few organizations have any insight into business-critical information stored in unstructured data.
There was a time when that didn’t matter. Those days are gone. Data is now accessible, copious, and dispersed, and it includes an alarming amount of business-critical information. It’s a target for both cybercriminals and regulators but securing it is incredibly difficult. It’s the data challenge of our generation.
Existing approaches aren’t doing the job. Keyword searches produce a bewildering array of possibly relevant documents that may or may not be business critical. Asking users to categorize documents requires extensive training and constant vigilance to make sure users are doing their part. What’s needed is an autonomous solution that can find and assess risk so you can secure your unstructured data wherever it lives.
That’s our mission. Concentric’s semantic intelligence solution reveals the meaning in your structured and unstructured data so you can fight off data loss and meet compliance and privacy mandates.
Check out our core cultural values and behavioural tenets here: https://concentric.ai/the-concentric-tenets-daily-behavior-to-aspire-to/
Title: Cloud DevOps Engineer
Role: Individual Contributor (4-8 yrs)
Requirements:
- Energetic self-starter, a fast learner, with a desire to work in a startup environment
- Experience working with Public Clouds like AWS
- Operating and Monitoring cloud infrastructure on AWS.
- Primary focus on building, implementing and managing operational support
- Design, Develop and Troubleshoot Automation scripts (Configuration/Infrastructure as code or others) for Managing Infrastructure.
- Expert in one of the scripting languages: Python, shell, etc.
- Experience with Nginx/HAProxy, ELK Stack, Ansible, Terraform, Prometheus-Grafana stack, etc
- Handling load monitoring, capacity planning, and services monitoring (a minimal monitoring sketch follows this list)
- Proven experience with CI/CD pipelines and handling database-upgrade-related issues
- Good understanding of and experience working with containerized environments like Kubernetes, and datastores like Cassandra, Elasticsearch, MongoDB, etc.
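As a concrete flavor of the services-monitoring item above, here is a minimal Python sketch that exposes a custom health gauge with the prometheus_client library, which a Prometheus-Grafana stack could scrape and chart. The health endpoint URL, metric name, and ports are illustrative assumptions.

```python
# Minimal services-monitoring sketch using prometheus_client
# (hypothetical health endpoint, metric name, and ports).
import time
import requests
from prometheus_client import Gauge, start_http_server

SERVICE_URL = "http://localhost:8080/healthz"  # assumed health endpoint
service_up = Gauge("example_service_up", "1 if the service health check passes")

if __name__ == "__main__":
    start_http_server(9100)  # metrics exposed at :9100/metrics for scraping
    while True:
        try:
            ok = requests.get(SERVICE_URL, timeout=2).status_code == 200
        except requests.RequestException:
            ok = False
        service_up.set(1 if ok else 0)
        time.sleep(15)  # poll interval; align with the Prometheus scrape interval
```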
Job Responsibilities:
Support, maintain, and enhance existing and new product functionality for trading software in a real-time, multi-threaded, multi-tier server environment, and create high- and low-level designs for concurrent, high-throughput, low-latency software architecture.
- Provide software development plans that meet future needs of clients and markets
- Evolve the new software platform and architecture by introducing new components and integrating them with existing ones
- Perform memory, CPU, and resource management
- Analyze stack traces, memory profiles and production incident reports from traders and support teams
- Propose fixes, and enhancements to existing trading systems
- Adhere to release and sprint planning with the Quality Assurance Group and Project Management
- Work on a team building new solutions based on requirements and features
- Attend and participate in daily scrum meetings
Required Skills:
- JavaScript and Python
- Multi-threaded browser and server applications (a minimal sketch follows this list)
- Amazon Web Services (AWS)
- REST
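As an illustration of the multi-threaded server pattern this role involves, here is a minimal Python sketch of a producer/consumer worker pool built from the standard library. The order fields and worker count are illustrative assumptions, and the print call stands in for real matching or risk logic.

```python
# Minimal producer/consumer sketch of a multi-threaded processing pipeline
# (order fields and worker count are illustrative assumptions).
import queue
import threading

orders: "queue.Queue[dict]" = queue.Queue()
NUM_WORKERS = 4

def worker() -> None:
    while True:
        order = orders.get()
        if order is None:  # sentinel: shut this worker down
            break
        # Stand-in for real matching/risk logic.
        print(f"processed order {order['id']} qty={order['qty']}")
        orders.task_done()

threads = [threading.Thread(target=worker, daemon=True) for _ in range(NUM_WORKERS)]
for t in threads:
    t.start()

for i in range(10):
    orders.put({"id": i, "qty": 100})

orders.join()        # block until every queued order has been processed
for _ in threads:
    orders.put(None)  # send one sentinel per worker to stop them
```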
CricStox is a Pune startup building a trading solution in the realm of gametech x fintech.
We intend to build a sport-agnostic platform to allow trading in stocks of sportspersons under any sport through our mobile & web-based applications.
We're currently hiring a Frontend Engineer who will gather and refine specifications and requirements based on technical needs, and implement them using best software development practices.
Responsibilities?
● Mainly, but not limited to, maintaining, expanding, and scaling our microservices/app/site.
● Integrate data from various back-end services and databases.
● Always be plugged into emerging technologies/industry trends and apply them to operations and activities.
● Comfortably work and thrive in a fast-paced environment, learn rapidly, and master diverse web technologies and techniques.
● Juggle multiple tasks within the constraints of timelines and budgets with business acumen.
What skills do I need?
● Excellent programming skills and in-depth knowledge of modern HTML5 and CSS3 (including preprocessors like SASS).
● Excellent programming skills in JavaScript or TypeScript.
● Basic understanding of Node.js with the Nest framework or equivalent.
● Good programming skills in Vue 3.x with the Composition API.
● Good understanding of CSS frameworks like Quasar, Tailwind, etc.
● A solid understanding of how web applications work, including security, session management, and best development practices.
● Adequate knowledge of database systems, OOP, and web application development.
● Good functional understanding of containerising applications using Docker.
● Basic understanding of how cloud infrastructures like AWS and GCP work.
● Basic understanding of setting up a GitHub CI/CD pipeline to automate building Docker images, pushing them to AWS ECR, and deploying to the cluster.
● Proficient understanding of code versioning tools, such as Git (or equivalent).
● Proficient understanding of Agile methodology.
● Hands-on experience with network diagnostics, monitoring, and network analytics tools.
● Basic knowledge of the Search Engine Optimization process.
● Aggressive problem diagnosis and creative problem-solving skills.