
Required Skills: CI/CD Pipeline, Data Structures, Microservices, Determining overall architectural principles, frameworks and standards, Cloud expertise (AWS, GCP, or Azure), Distributed Systems
Criteria:
- Candidate must have 6+ years of backend engineering experience, with 1–2 years leading engineers or owning major systems.
- Must be strong in one core backend language: Node.js, Go, Java, or Python.
- Deep understanding of distributed systems, caching, high availability, and microservices architecture.
- Hands-on experience with AWS/GCP, Docker, Kubernetes, and CI/CD pipelines.
- Strong command of system design, data structures, performance tuning, and scalable architecture.
- Ability to partner with Product, Data, Infrastructure, and lead end-to-end backend roadmap execution.
Description
What This Role Is All About
We’re looking for a Backend Tech Lead who’s equally obsessed with architecture decisions and clean code, someone who can zoom out to design systems and zoom in to fix that one weird memory leak. You’ll lead a small but sharp team, drive the backend roadmap, and make sure our systems stay fast, lean, and battle-tested.
What You’ll Own
● Architect backend systems that handle India-scale traffic without breaking a sweat.
● Build and evolve microservices, APIs, and internal platforms that our entire app depends on.
● Guide, mentor, and uplevel a team of backend engineers—be the go-to technical brain.
● Partner with Product, Data, and Infra to ship features that are reliable and delightful.
● Set high engineering standards—clean architecture, performance, automation, and testing.
● Lead discussions on system design, performance tuning, and infra choices.
● Keep an eye on production like a hawk: metrics, monitoring, logs, uptime.
● Identify gaps proactively and push for improvements instead of waiting for fires.
What Makes You a Great Fit
● 6+ years of backend experience; 1–2 years leading engineers or owning major systems.
● Strong in one core language (Node.js / Go / Java / Python) — pick your sword.
● Deep understanding of distributed systems, caching, high-availability, and microservices.
● Hands-on with AWS/GCP, Docker, Kubernetes, CI/CD pipelines.
● You think data structures and system design are not interviews — they’re daily tools.
● You write code that future-you won’t hate.
● Strong communication and a "let's figure this out" attitude.
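The caching and high-availability themes in the list above can be illustrated with a minimal in-process TTL cache. This is only a sketch: a backend at the scale this role describes would normally use an external store such as Redis or Memcached, and all names and TTL values here are made up for the example.

```python
import time

class TTLCache:
    """Minimal in-process cache with per-entry expiry (illustrative only)."""

    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expiry timestamp)

    def set(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)

    def get(self, key, default=None):
        entry = self._store.get(key)
        if entry is None:
            return default
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self._store[key]  # lazy eviction of stale entries
            return default
        return value

cache = TTLCache(ttl_seconds=0.05)
cache.set("user:42", {"name": "Asha"})
print(cache.get("user:42"))  # → {'name': 'Asha'}
time.sleep(0.06)
print(cache.get("user:42"))  # expired, evicted → None
```

Even a toy like this surfaces the real design questions the role cares about: eviction strategy, staleness windows, and what happens on a cache miss under load.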
Bonus Points If You Have
● Built or scaled consumer apps with millions of DAUs.
● Experimented with event-driven architecture, streaming systems, or real-time pipelines.
● Love startups and don’t mind wearing multiple hats.
● Experience with logging/monitoring tools like Grafana, Prometheus, ELK, OpenTelemetry.
Why This Company Might Be Your Best Move
● Work on products used by real people every single day.
● Ownership from day one—your decisions will shape our core architecture.
● No unnecessary hierarchy; direct access to founders and senior leadership.
● A team that cares about quality, speed, and impact in equal measure.
● Build for Bharat — complex constraints, huge scale, real impact.
Hybrid work mode
(Azure) EDW: Experience loading Star-schema data warehouses using framework architectures, including loading Type 2 dimensions. Experience ingesting data from various sources (structured and semi-structured), with hands-on experience ingesting via APIs into lakehouse architectures.
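"Loading Type 2 dimensions" means preserving history: when an attribute changes, the current row is closed out and a new versioned row is opened. The sketch below shows that logic in plain Python under assumed column names (`key`, `attr`, `valid_from`, `valid_to`, `is_current`); in Databricks or an EDW this would typically be a MERGE statement rather than application code.

```python
from datetime import date

def apply_scd2(dimension, incoming, today):
    """Apply a Type 2 slowly changing dimension update.

    `dimension` is a list of row dicts; `incoming` maps business key ->
    latest attribute value. Changed keys get their current row closed
    and a new versioned row appended. Column names are illustrative.
    """
    for key, new_attr in incoming.items():
        current = next(
            (r for r in dimension if r["key"] == key and r["is_current"]),
            None,
        )
        if current is None:
            # brand-new business key: open its first version
            dimension.append({"key": key, "attr": new_attr,
                              "valid_from": today, "valid_to": None,
                              "is_current": True})
        elif current["attr"] != new_attr:
            # close the old version, open the new one
            current["valid_to"] = today
            current["is_current"] = False
            dimension.append({"key": key, "attr": new_attr,
                              "valid_from": today, "valid_to": None,
                              "is_current": True})
    return dimension

dim = [{"key": "C1", "attr": "Chennai", "valid_from": date(2023, 1, 1),
        "valid_to": None, "is_current": True}]
apply_scd2(dim, {"C1": "Bengaluru"}, today=date(2024, 6, 1))
# dim now holds two rows for C1: a closed historical row and a new current row
```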
Key Skills: Azure Databricks, Azure Data Factory, Azure Data Lake Storage Gen2, SQL (expert), Python (intermediate), Azure Cloud Services knowledge, data analysis (SQL), data warehousing, documentation – BRD, FRD, user story creation.
Wissen Technology is hiring for a DevOps Engineer
Required:
- 4 to 10 years of relevant experience in DevOps
- Must have hands-on experience with AWS, Kubernetes, and CI/CD pipelines
- Good to have exposure to GitHub or GitLab
- Open to work from Chennai
- Work mode will be Hybrid
Company profile:
Company Name : Wissen Technology
Group of companies in India : Wissen Technology & Wissen Infotech
Work Location - Chennai
Website : www.wissen.com
Wissen Thought leadership : https://lnkd.in/gvH6VBaU
LinkedIn: https://lnkd.in/gnK-vXjF
Overview
adesso India specialises in optimization of core business processes for organizations. Our focus is on providing state-of-the-art solutions that streamline operations and elevate productivity to new heights.
Comprised of a team of industry experts and experienced technology professionals, we ensure that our software development and implementations are reliable, robust, and seamlessly integrated with the latest technologies. By leveraging our extensive knowledge and skills, we empower businesses to achieve their objectives efficiently and effectively.
Job Description
We are seeking a skilled Cloud Data Engineer with experience in cloud data platforms such as AWS or Azure, and especially Snowflake and dbt, to join our dynamic team. As a consultant, you will be responsible for developing new data platforms and creating data processes. You will collaborate with cross-functional teams to design, develop, and deploy high-quality solutions.
Responsibilities:
Customer consulting: You develop data-driven products in the Snowflake Cloud and connect data & analytics with specialist departments. You develop ELT processes using dbt (data build tool).
Specifying requirements: You develop concrete requirements for future-proof cloud data architectures.
Develop data routes: You design scalable and powerful data management processes.
Analyze data: You derive sound findings from data sets and present them in an understandable way.
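The ELT responsibilities above centre on dbt, where transformations are written as SQL SELECT models. As a rough Python analogue of what a typical staging model does (rename source columns, normalise values, deduplicate to the latest record per business key), consider the sketch below; the table and column names are invented for illustration.

```python
def stg_customers(raw_rows):
    """Rough Python analogue of a dbt staging model.

    A real dbt project would express this as SQL in something like
    models/staging/stg_customers.sql. Steps: rename source columns,
    normalise casing, keep only the latest record per business key.
    """
    latest = {}
    for row in raw_rows:
        cleaned = {
            "customer_id": row["CUST_ID"],
            "email": row["EMAIL"].strip().lower(),
            "loaded_at": row["LOAD_TS"],
        }
        key = cleaned["customer_id"]
        # keep only the most recently loaded version of each customer
        if key not in latest or cleaned["loaded_at"] > latest[key]["loaded_at"]:
            latest[key] = cleaned
    return list(latest.values())

raw = [
    {"CUST_ID": 1, "EMAIL": " Asha@Example.com ", "LOAD_TS": "2024-01-01"},
    {"CUST_ID": 1, "EMAIL": "asha@example.com", "LOAD_TS": "2024-02-01"},
    {"CUST_ID": 2, "EMAIL": "ravi@example.com", "LOAD_TS": "2024-01-15"},
]
print(stg_customers(raw))  # two deduplicated, cleaned customer rows
```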
Requirements:
Requirements management and project experience: You successfully implement cloud-based data & analytics projects.
Data architectures: You are proficient in DWH/data lake concepts and modeling with Data Vault 2.0.
Cloud expertise: You have extensive knowledge of Snowflake, dbt and other cloud technologies (e.g. MS Azure, AWS, GCP).
SQL know-how: You have a sound and solid knowledge of SQL.
Data management: You are familiar with topics such as master data management and data quality.
Bachelor's degree in computer science or a related field.
Strong communication and collaboration abilities to work effectively in a team environment.
Skills & Requirements
Cloud Data Engineering, AWS, Azure, Snowflake, dbt, ELT processes, Data-driven consulting, Cloud data architectures, Scalable data management, Data analysis, Requirements management, Data warehousing, Data lake, Data Vault 2.0, SQL, Master data management, Data quality, GCP, Strong communication, Collaboration.
The ideal candidate will be responsible for creating, installing and managing our databases. You will ensure optimal database performance by analyzing database issues and monitoring database performance.
What will I be doing?
- A professional Database Administrator (DBA) will be responsible for the architecture, performance, integrity, and security of a database.
- Design and build highly available Microsoft SQL Server and MySQL solutions, making use of log shipping, mirroring, and Always On technologies
- Ensure performance, security, and availability of databases
- Prepare documentation and specifications
- Handle common database procedures such as upgrade, backup, recovery, migration, etc.
- Design, create and implement database systems based on the end user's requirements.
- Collaborate with other team members and stakeholders
- Collaborate with the engineering and production support teams to create forward-looking service improvement plans and roadmaps.
- Installation, configuration and upgrading of application software and related products
What skills do I need?
- Minimum 5 years of experience in database administration, information technology, database architecture, or a related field
- Installation, configuration, and upgrading of MS SQL Server/Oracle/MySQL software and related products
- Strong proficiency with Microsoft SQL, and ideally its variants in other popular RDBMSs
- Skilled at optimizing large complicated SQL statements
- Proven working experience as a Database Administrator and expert in MongoDB
- Install server software, configure database servers, monitor and maintain system health and security
- Excellent knowledge of data backup, recovery, security, integrity
- Familiarity with database design, documentation and coding
- Previous experience with DBA CASE tools (frontend/backend) and third-party tools
- Problem solving skills and ability to think algorithmically
- Experience with large databases and complex process models
Required Qualification :
- Bachelor's Degree or equivalent experience in Computer Science, Technology, or a related field of study
- Proven knowledge of SQL Server, cPanel, and MySQL
- Strong analytical, problem-solving, and decision-making skills
- Advanced working SQL and MySQL knowledge and experience working with relational databases, query authoring (SQL)
- Excellent knowledge of Performance tuning, DB administration, data backup, recovery, security, integrity
- Experience working with both structured and unstructured data.
- Excellent problem-solving and analytical skills
- Good communication, teamwork and negotiation skills
- 3–6 years of software development and operations experience deploying and maintaining multi-tiered infrastructure and applications at scale.
- Design cloud infrastructure that is secure, scalable, and highly available on AWS
- Experience managing any distributed NoSQL system (Kafka/Cassandra/etc.)
- Experience with Containers, Microservices, deployment and service orchestration using Kubernetes, EKS (preferred), AKS or GKE.
- Strong scripting language knowledge, such as Python, Shell
- Experience and a deep understanding of Kubernetes.
- Experience in Continuous Integration and Delivery.
- Work collaboratively with software engineers to define infrastructure and deployment requirements
- Provision, configure and maintain AWS cloud infrastructure
- Ensure configuration and compliance with configuration management tools
- Administer and troubleshoot Linux-based systems
- Troubleshoot problems across a wide array of services and functional areas
- Build and maintain operational tools for deployment, monitoring, and analysis of AWS infrastructure and systems
- Perform infrastructure cost analysis and optimization
- AWS
- Docker
- Kubernetes
- Envoy
- Istio
- Jenkins
- Cloud Security & SIEM stacks
- Terraform
Senior DevOps Engineer (8-12 yrs Exp)
Job Description:
We are looking for an experienced and enthusiastic DevOps Engineer. As our new DevOps Engineer, you will be in charge of the specification and documentation of new project features. In addition, you will develop new features and write automation scripts using Java/Python/Bash, with Bitbucket for source control.
Roles and Responsibilities:
• Deploy updates and fixes
• Utilize various open source technologies
• Hands-on experience with automation tools like Docker, Jenkins, Puppet, etc.
• Build independent web based tools, micro-services and solutions
• Write scripts and automation using Java/Python/Bash.
• Configure and manage data sources like MySQL, MongoDB, Elasticsearch, Redis, etc.
• Understand how various systems work
• Manage code deployments, fixes, updates and related processes.
• Understand how IT operations are managed
• Work with CI/CD tools, and source control such as Git and SVN.
• Experience with project management and workflow tools and methodologies such as Redmine, Workfront, and Scrum/Kanban/SAFe.
• Build tools to reduce occurrences of errors and improve customer experience
• Develop software to integrate with internal back-end systems
• Perform root cause analysis for production errors
• Design procedures for system troubleshooting and maintenance
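The automation and root-cause-analysis duties above often start with small scripts of exactly this shape: scan logs, count failures per component, surface the worst offenders. The sketch below is a minimal, hedged example; the log format and component names are assumptions, not a real system's output.

```python
import re
from collections import Counter

ERROR_RE = re.compile(r"ERROR\s+(\S+):")  # hypothetical "LEVEL component:" log format

def top_errors(log_lines, n=3):
    """Count ERROR occurrences per component in a log stream.

    A minimal example of the automation scripting this role describes:
    useful as a first pass when triaging production errors.
    """
    counts = Counter()
    for line in log_lines:
        match = ERROR_RE.search(line)
        if match:
            counts[match.group(1)] += 1
    return counts.most_common(n)

logs = [
    "2024-06-01 10:00:01 ERROR payments: timeout calling gateway",
    "2024-06-01 10:00:02 INFO  auth: login ok",
    "2024-06-01 10:00:03 ERROR payments: timeout calling gateway",
    "2024-06-01 10:00:04 ERROR search: index unavailable",
]
print(top_errors(logs))  # → [('payments', 2), ('search', 1)]
```

In practice a script like this would feed a dashboard or alert rather than a print statement, but the triage logic is the same.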
Requirements:
• More than six years of experience in a DevOps Engineer role (or similar role); experience in software development and infrastructure development is mandatory.
• Bachelor’s degree or higher in engineering or related field
• Proficiency in deploying and maintaining web applications
• Ability to construct and execute network, server, and application status monitoring
• Knowledge of software automation production systems, including code deployment
• Working knowledge of software development methodologies
• Previous experience with high-performance and high-availability open source web
technologies
• Strong experience with Linux-based infrastructures, Linux/Unix administration, and
AWS.
• Strong communication skills and the ability to explain protocols and processes to the team and management.
• Solid team player.
• Microsoft: ASP.NET Core, Azure, C#/Web API
- Proficient in Java, Node or Python
- Experience with NewRelic, Splunk, SignalFx, DataDog etc.
- Monitoring and alerting experience
- Full stack development experience
- Hands-on with building and deploying micro services in Cloud (AWS/Azure)
- Experience with Terraform for Infrastructure as Code
- Should have experience troubleshooting live production systems using monitoring/log analytics tools
- Should have experience leading a team (2 or more engineers)
- Experienced using Jenkins or similar deployment pipeline tools
- Understanding of distributed architectures
- Write, test, debug and ship code and gather feedback on the scale, performance, security to incorporate back into the platform.
- Work with the founders to identify complex technical problems and solve them.
- Work with the product design and client experience development team to support
them with scalable services
- Feed into the overall mission and vision of the eParchi platform over the coming months and years.
- An ability to perform well in a fast-paced environment
- Excellent analytical and multitasking skills.









