

Wohlig Transformations Pvt Ltd
https://wohlig.com
About
Wohlig is the catalyst that transforms our clients' perspective of technology and empowers them to take advantage of the constantly evolving digital revolution.
We simply want to help our clients with our expertise to make their businesses efficient and cutting-edge, with a low cost of ownership.
Jobs at Wohlig Transformations Pvt Ltd

Job Overview
Architect and build scalable, high-performance backend systems while working on mission-critical platforms that process real-time market data and portfolio analytics. The role also involves leveraging Generative AI capabilities to enhance data intelligence, automation, and user-facing features, while ensuring regulatory compliance and secure financial transactions.
Key Responsibilities
- Design, develop, and maintain scalable backend services and APIs using Node.js and Python
- Build event-driven architectures using RabbitMQ and Kafka for real-time data processing
- Develop and manage data pipelines integrating PostgreSQL and BigQuery for analytics and warehousing
- Integrate and deploy Generative AI models (LLMs, embeddings, AI APIs) into backend systems for automation, insights, and intelligent workflows
- Design AI-powered features such as recommendation systems, document processing, or conversational interfaces
- Ensure system reliability, security, and low-latency performance for mission-critical systems
- Lead technical design discussions, conduct code reviews, and mentor junior engineers
- Optimize database queries, implement caching strategies, and improve overall system performance
- Collaborate with cross-functional teams to deliver end-to-end product features
- Implement monitoring, logging, and observability solutions
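The event-driven responsibilities above boil down to a publish/consume loop. The following is a minimal in-memory sketch of that pattern, using Python's standard library as a stand-in for a real broker such as RabbitMQ or Kafka; the topic name and payload fields are invented for illustration only.

```python
import json
import queue

# In-memory stand-in for a message broker (RabbitMQ/Kafka in production).
broker: "queue.Queue[str]" = queue.Queue()

def publish(topic: str, payload: dict) -> None:
    """Producer side: serialize the event and enqueue it."""
    broker.put(json.dumps({"topic": topic, "payload": payload}))

def consume_all() -> list:
    """Consumer side: drain the queue, handling events by topic."""
    results = []
    while not broker.empty():
        event = json.loads(broker.get())
        if event["topic"] == "trade.executed":
            p = event["payload"]
            # Derived field computed at consume time (notional = qty * price).
            p["notional"] = p["qty"] * p["price"]
            results.append(p)
    return results

publish("trade.executed", {"symbol": "INFY", "qty": 10, "price": 1500.0})
publish("trade.executed", {"symbol": "TCS", "qty": 5, "price": 3500.0})
trades = consume_all()
```

In a real deployment the queue would be replaced by a broker client (e.g. a pika channel for RabbitMQ), consumers would run as separate worker processes, and failed messages would be acknowledged or dead-lettered rather than silently dropped.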
Required Skills and Qualifications
- 2+ years of professional backend development experience
- Strong expertise in Node.js and Python for production-grade applications
- Proven experience building RESTful APIs and microservices architectures
- Experience working with Generative AI frameworks/APIs (OpenAI, LangChain, vector databases, prompt engineering)
- Understanding of integrating LLMs into production systems (RAG, embeddings, fine-tuning basics)
- Strong proficiency in PostgreSQL, including query optimization and schema design
- Hands-on experience with RabbitMQ and Kafka
- Experience with BigQuery or similar data warehousing solutions
- Solid understanding of distributed systems, scalability patterns, and high-traffic applications
- Strong knowledge of authentication, authorization, and security best practices
- Experience with Git, CI/CD pipelines, and modern development workflows
- Excellent problem-solving and debugging skills
- Exposure to fintech or financial services, cloud platforms (GCP/AWS/Azure), Docker/Kubernetes, caching tools (Redis/Memcached), and regulatory requirements (KYC, compliance, data privacy) is a plus
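The RAG requirement above centres on retrieving the most relevant documents by embedding similarity before prompting an LLM. Below is a toy sketch of just the retrieval step, with hand-written 3-dimensional vectors standing in for real model embeddings; the document names and vectors are invented.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy 3-dimensional "embeddings"; a real system would call an embedding model
# and store the vectors in a vector database.
documents = {
    "KYC checklist": [0.9, 0.1, 0.0],
    "API rate limits": [0.1, 0.8, 0.3],
    "Holiday calendar": [0.0, 0.2, 0.9],
}

def retrieve(query_vec, k=1):
    """Rank documents by similarity to the query vector (the 'R' in RAG)."""
    ranked = sorted(documents.items(),
                    key=lambda kv: cosine(query_vec, kv[1]),
                    reverse=True)
    return [name for name, _ in ranked[:k]]

top = retrieve([0.85, 0.15, 0.05])
```

The retrieved documents would then be concatenated into the LLM prompt as grounding context; production systems add chunking, re-ranking, and approximate nearest-neighbour search on top of this basic idea.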
Apply directly at: https://wohlig.keka.com/careers/jobdetails/136351

Job Overview:
We are seeking an experienced DevOps Engineer to join our team. The successful candidate will be responsible for designing, implementing, and maintaining the infrastructure and software systems required to support our development and production environments. The ideal candidate should have a strong background in Linux, GitHub Actions/Jenkins, ArgoCD, AWS, Kubernetes, Helm, Datadog, MongoDB, Envoy Proxy, Cert-Manager, Terraform, ELK, Cloudflare, and BigRock.
Kindly apply at https://wohlig.keka.com/careers/jobdetails/54566
Responsibilities:
• Design, implement, and maintain CI/CD pipelines using GitHub Actions/Jenkins, Kubernetes, Helm, and ArgoCD.
• Deploy and manage Kubernetes clusters on AWS.
• Configure and maintain Envoy Proxy and Cert-Manager to automate deployment and manage application environments.
• Monitor system performance using Datadog, ELK, and Cloudflare tools.
• Automate infrastructure management and maintenance tasks using Terraform, Ansible, or similar tools.
• Collaborate with development teams to design, implement and test infrastructure changes.
• Troubleshoot and resolve infrastructure issues as they arise.
• Participate in on-call rotation and provide support for production issues.
Qualifications:
• Bachelor's or Master's degree in Computer Science, Engineering or a related field.
• 3+ years of experience in DevOps engineering with a focus on Linux, GitHub Actions/CodeFresh, ArgoCD, AWS, Kubernetes, Helm, Datadog, MongoDB, Envoy Proxy, Cert-Manager, Terraform, ELK, Cloudflare, and BigRock.
• Strong understanding of Linux administration and shell scripting.
• Experience with automation tools such as Terraform, Ansible, or similar.
• Ability to write infrastructure as code using tools such as Terraform, Ansible, or similar.
• Experience with container orchestration platforms such as Kubernetes.
• Familiarity with container technologies such as Docker.
• Experience with cloud providers such as AWS.
• Experience with monitoring tools such as Datadog and ELK.
Skills:
• Strong analytical and problem-solving skills.
• Excellent communication and collaboration skills.
• Ability to work independently or in a team environment.
• Strong attention to detail.
• Ability to learn and apply new technologies quickly.
• Ability to work in a fast-paced and dynamic environment.
• Strong understanding of DevOps principles and methodologies.
Description:
Experience: 3+ Years
Job Type: Full-time
Location: Sion, Mumbai (On-site)
Also Apply at: https://wohlig.keka.com/careers/jobdetails/122580
The Opportunity:
We are a global technology consultancy driving large-scale digital transformations for the Fortune 500. As a strategic partner to Google, we help enterprise clients navigate complex data landscapes: migrating legacy systems to the cloud, optimizing costs, and turning raw data into executive-level insights.
We are seeking a Data Analyst who acts less like a technician and more like a Data Consultant. You will blend deep technical expertise in SQL and ETL with the soft skills required to tell compelling data stories to non-technical stakeholders.
What You Will Do:
- Strategic Cloud Data Architecture: Lead high-impact data migration projects. You will assess a client's legacy infrastructure and design the logic to move it to the cloud efficiently (focusing on scalability and security).
- Cost and Performance Optimization: Audit and optimize cloud data warehouses (e.g., BigQuery, Snowflake, Redshift). You will use logical reasoning to hunt down inefficiencies, optimize queries, and restructure data models to save clients significant operational costs.
- End-to-End Data Pipelines (ETL/ELT): Build and maintain robust data pipelines. Whether it's streaming or batch processing, you will ensure data flows seamlessly from source to dashboard using modern frameworks.
- Data Storytelling & Visualization: This is critical. You will build dashboards (Looker, Tableau, Power BI) that don't just show numbers but answer business questions. You must be able to present these findings to C-suite clients with clarity and confidence.
- Advanced Analytics: Apply statistical rigor and logical deduction to solve unstructured business problems (e.g., "Why is our user retention dropping?").
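A question like "Why is our user retention dropping?" typically starts from a cohort retention metric: of the users active in one period, what share came back in the next? A minimal sketch with invented user IDs:

```python
# Users active in two consecutive weeks (toy data, IDs are invented).
week1_active = {"u1", "u2", "u3", "u4"}
week2_active = {"u2", "u4", "u5"}

# Week-over-week retention: returning users / week-1 cohort size.
retained = week1_active & week2_active
retention_rate = len(retained) / len(week1_active)
```

In practice the same calculation is run per signup cohort over many periods (often directly in SQL), and a drop in the resulting curve is then decomposed by segment, channel, or product change to find the cause.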
What We Are Looking For:
1. Core Competencies:
- Logical Reasoning: You possess a deductive mindset. You can break down ambiguous client problems into solvable technical components without needing hand-holding.
- Advanced SQL Mastery: You are fluent in complex SQL (window functions, CTEs, stored procedures) and understand how to write efficient queries over massive datasets.
- Communication Skills: You are an articulate storyteller who can bridge the gap between engineering jargon and business value.
2. Technical Experience:
- Cloud Proficiency: 3+ years of experience working within a major public cloud ecosystem (GCP, AWS, or Azure). Note: While we primarily use Google Cloud (BigQuery, Looker), we value strong architectural fundamentals over specific tool knowledge.
- Data Engineering & ETL: Experience with big data processing tools (e.g., Spark, Apache Beam, Databricks, or cloud-native equivalents).
- Visualization: Proven mastery of at least one enterprise BI tool (Looker, Tableau, Power BI, Qlik).
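The SQL expectations above (CTEs, window functions) can be shown in a small runnable example; sqlite3 is used here purely for portability (window functions need SQLite 3.25+), and the table and columns are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("north", 100), ("north", 300), ("south", 200)])

# CTE + window function: each sale's share of its region's total.
rows = conn.execute("""
    WITH regional AS (
        SELECT region, amount,
               SUM(amount) OVER (PARTITION BY region) AS region_total
        FROM sales
    )
    SELECT region, amount, ROUND(amount * 1.0 / region_total, 2) AS share
    FROM regional
    ORDER BY region, amount
""").fetchall()
```

The same pattern (a CTE feeding a windowed aggregate) carries over directly to BigQuery or Snowflake, where partition pruning and clustering decide whether such a query is cheap or expensive at scale.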
Why Join Us?
- Cross-Cloud Exposure: While our current focus is GCP, your foundational skills will be challenged and expanded across various tech stacks.
- Client Impact: You aren't just writing code in a back room; you are the face of our data capability, directly advising clients on how to run their businesses better.
- Growth: We offer a structured career path with sponsorship for cloud certifications (Google Professional Data Engineer, etc.).
Job Overview
We are seeking an experienced Senior Solution Architect to join our dynamic DevOps organization. The ideal candidate will have a strong background in cloud technologies, with expertise in migration projects across platforms such as GCP, AWS, and Azure. The candidate should possess a deep understanding of DevOps principles, Kubernetes orchestration, data migration and management, and automation tools such as CI/CD pipelines and Terraform. The individual should be highly skilled in designing scalable application architectures capable of handling substantial workloads while ensuring the highest standards of quality.
Key Responsibilities
- Lead and drive cloud migration projects from on-premises data centers or other cloud platforms to GCP, AWS, or Azure.
- Design and implement migration strategies that ensure minimal downtime and maximum efficiency.
- Demonstrate proficiency in GCP, AWS, and Azure, with the ability to choose and optimize solutions based on specific business requirements.
- Provide guidance on selecting the appropriate cloud services for various workloads.
- Design, implement, and optimize CI/CD pipelines to streamline software delivery.
- Utilize Terraform for infrastructure as code (IaC) to automate deployment processes.
- Collaborate with development and operations teams to enhance the overall DevOps culture.
- Possess in-depth knowledge and practical experience with Kubernetes orchestration for containerized applications.
- Architect and optimize Kubernetes clusters for high availability and scalability.
- Engage in research and development activities to stay abreast of industry trends and emerging technologies.
- Evaluate and introduce new tools and methodologies to enhance the efficiency and effectiveness of cloud solutions.
- Architect solutions that can handle large-scale workloads and provide guidance on scaling strategies.
- Ensure high-performance levels and reliability in production environments.
- Design scalable and high-performance database architectures tailored to meet business needs.
- Execute database migrations with a keen focus on data consistency, integrity, and performance.
- Develop and implement database pipelines to automate processes such as data migrations, schema changes, and backups.
- Optimize database workflows to enhance efficiency and reliability.
- Work closely with clients to assess and enhance the quality of existing architectures.
- Implement best practices to ensure robust, secure, and well-architected solutions.
- Drive migration projects, collaborating with cross-functional teams to ensure successful execution.
- Provide technical leadership and mentorship to junior team members.
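The database migration responsibilities above usually reduce to batched copying plus a consistency check before cutover. Below is a toy sketch using sqlite3 in-memory databases; the table schema, batch size, and checksum choice are illustrative assumptions, not a prescribed method.

```python
import sqlite3

src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
for db in (src, dst):
    db.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
src.executemany("INSERT INTO accounts VALUES (?, ?)",
                [(1, 10.0), (2, 20.5), (3, 0.0)])

BATCH = 2  # tiny batch size for the example; production batches are far larger

def migrate(src, dst, batch):
    """Copy rows in keyset-paginated batches, then verify source and target agree."""
    last_id = 0
    while True:
        rows = src.execute(
            "SELECT id, balance FROM accounts WHERE id > ? ORDER BY id LIMIT ?",
            (last_id, batch)).fetchall()
        if not rows:
            break
        dst.executemany("INSERT INTO accounts VALUES (?, ?)", rows)
        last_id = rows[-1][0]
    # Consistency check: row counts and balance sums must match before cutover.
    s = src.execute("SELECT COUNT(*), TOTAL(balance) FROM accounts").fetchone()
    d = dst.execute("SELECT COUNT(*), TOTAL(balance) FROM accounts").fetchone()
    return s == d

ok = migrate(src, dst, BATCH)
```

Keyset pagination (`WHERE id > ?`) rather than OFFSET keeps each batch cheap on large tables, and the count/sum comparison is a simple stand-in for the per-table checksums a real migration would run before switching traffic.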
Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or related field.
- Relevant industry experience in a Solution Architect role.
- Proven experience in leading cloud migration projects across GCP, AWS, and Azure.
- Expertise in DevOps practices, CI/CD pipelines, and infrastructure automation.
- In-depth knowledge of Kubernetes and container orchestration.
- Strong background in scaling architectures to handle significant workloads.
- Sound knowledge of database migrations.
- Excellent communication skills and the ability to articulate complex technical concepts to both technical and non-technical stakeholders.






