
Opportunity for Unix Developers!
We at Datametica are looking for talented Unix engineers who will be trained and given the opportunity to work on Google Cloud Platform, data warehousing (DWH), and Big Data.
Experience - 2 to 7 years
Job location - Pune
Mandatory Skills:
Strong experience in Unix with Shell Scripting development.
What opportunities do we offer?
- Selected candidates will be offered training in one or more of the following: Google Cloud, AWS, DevOps tools, and Big Data technologies such as Hadoop, Pig, Hive, Spark, Sqoop, Flume, and Kafka
- You will get the chance to be part of enterprise-grade implementations of Cloud and Big Data systems
- You will play an active role in setting up a modern data platform based on Cloud and Big Data
- You will be part of teams with rich experience in various aspects of distributed systems and computing.

JOB DETAILS:
* Job Title: Lead I - Software Engineering - Kotlin, Java, Spring Boot, AWS
* Industry: Global digital transformation solutions provider
* Salary: Best in Industry
* Experience: 5 -7 years
* Location: Trivandrum, Thiruvananthapuram
Role Proficiency:
Act creatively to develop applications and select appropriate technical options, optimizing application development, maintenance, and performance by employing design patterns and reusing proven solutions; account for others' development activities.
Skill Examples:
- Explain and communicate the design / development to the customer
- Perform and evaluate test results against product specifications
- Break down complex problems into logical components
- Develop user interfaces and business software components
- Use data models
- Estimate time and effort required for developing / debugging features / components
- Perform and evaluate test in the customer or target environment
- Make quick decisions on technical/project related challenges
- Manage a team; mentor and handle people-related issues in the team
- Maintain high motivation levels and positive dynamics in the team.
- Interface with other teams, designers, and other parallel practices
- Set goals for self and team. Provide feedback to team members
- Create and articulate impactful technical presentations
- Follow high level of business etiquette in emails and other business communication
- Drive conference calls with customers addressing customer questions
- Proactively ask for and offer help
- Ability to work under pressure, determine dependencies and risks, facilitate planning, and handle multiple tasks
- Build confidence with customers by meeting the deliverables on time with quality.
- Estimate the time, effort, and resources required for developing / debugging features / components
- Make appropriate utilization of software / hardware
- Strong analytical and problem-solving abilities
Knowledge Examples:
- Appropriate software programs / modules
- Functional and technical designing
- Programming languages – proficient in multiple skill clusters
- DBMS
- Operating Systems and software platforms
- Software Development Life Cycle
- Agile – Scrum or Kanban Methods
- Integrated development environment (IDE)
- Rapid application development (RAD)
- Modelling technology and languages
- Interface definition languages (IDL)
- Knowledge of the customer domain and a deep understanding of the sub-domain where the problem is solved
Additional Comments:
We are seeking an experienced Senior Backend Engineer with strong expertise in Kotlin and Java to join our dynamic engineering team.
The ideal candidate will have a deep understanding of backend frameworks, cloud technologies, and scalable microservices architectures, with a passion for clean code, resilience, and system observability.
You will play a critical role in designing, developing, and maintaining core backend services that power our high-availability e-commerce and promotion platforms.
Key Responsibilities
Design, develop, and maintain backend services using Kotlin (JVM, Coroutines, Serialization) and Java.
Build robust microservices with Spring Boot and related Spring ecosystem components (Spring Cloud, Spring Security, Spring Kafka, Spring Data).
Implement efficient serialization/deserialization using Jackson and Kotlin Serialization. Develop, maintain, and execute automated tests using JUnit 5, Mockk, and ArchUnit to ensure code quality.
Work with Kafka Streams (Avro), Oracle SQL (JDBC, JPA), DynamoDB, and Redis for data storage and caching needs. Deploy and manage services in an AWS environment leveraging DynamoDB, Lambdas, and IAM.
Implement CI/CD pipelines with GitLab CI to automate build, test, and deployment processes.
Containerize applications using Docker and integrate monitoring using Datadog for tracing, metrics, and dashboards.
Define and maintain infrastructure as code using Terraform for services including GitLab, Datadog, Kafka, and Optimizely.
Develop and maintain RESTful APIs with OpenAPI (Swagger) and JSON API standards.
Apply resilience patterns using Resilience4j to build fault-tolerant systems.
Adhere to architectural and design principles such as Domain-Driven Design (DDD), Object-Oriented Programming (OOP), and Contract Testing (Pact).
Collaborate with cross-functional teams in an Agile Scrum environment to deliver high-quality features.
Utilize feature flagging tools like Optimizely to enable controlled rollouts.
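The resilience patterns the posting mentions (Resilience4j's Retry module, for instance) come down to wrapping remote calls so that transient failures are retried with backoff instead of failing the request outright. As a rough, language-agnostic sketch of the idea in Python (function names here are illustrative, not part of Resilience4j's API):

```python
import time

def retry_with_backoff(fn, max_attempts=3, base_delay=0.1):
    """Retry fn with exponential backoff between attempts -- a sketch of
    the pattern Resilience4j's Retry module provides for JVM services."""
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts:
                raise
            time.sleep(base_delay * (2 ** (attempt - 1)))

# Usage: a flaky call that only succeeds on the third attempt.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

print(retry_with_backoff(flaky))  # prints "ok" after two retries
```

In a real Kotlin/Spring service this logic would be configured declaratively on the client bean rather than hand-rolled, often combined with a circuit breaker so a persistently failing dependency is cut off entirely.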
Mandatory Skills & Technologies
Languages: Kotlin (JVM, Coroutines, Serialization), Java
Frameworks: Spring Boot (Spring Cloud, Spring Security, Spring Kafka, Spring Data)
Serialization: Jackson, Kotlin Serialization
Testing: JUnit 5, Mockk, ArchUnit
Data: Kafka Streams (Avro), Oracle SQL (JDBC, JPA), DynamoDB (NoSQL), Redis (caching)
Cloud: AWS (DynamoDB, Lambda, IAM)
CI/CD: GitLab CI
Containers: Docker
Monitoring & Observability: Datadog (Tracing, Metrics, Dashboards, Monitors)
Infrastructure as Code: Terraform (GitLab, Datadog, Kafka, Optimizely)
API: OpenAPI (Swagger), REST API, JSON API
Resilience: Resilience4j
Architecture & Practices: Domain-Driven Design (DDD), Object-Oriented Programming (OOP), Contract Testing (Pact), Feature Flags (Optimizely)
Platforms: E-Commerce Platform (CommerceTools), Promotion Engine (Talon.One)
Methodologies: Scrum, Agile
Skills: Kotlin, Java, Spring Boot, AWS
Must-Haves
Kotlin (JVM, Coroutines, Serialization), Java, Spring Boot (Spring Cloud, Spring Security, Spring Kafka, Spring Data), AWS (DynamoDB, Lambda, IAM), Microservices Architecture
******
Notice period - 0 to 15 days only
Job stability is mandatory
Location: Trivandrum
Virtual Weekend Interview on 7th Feb 2026 - Saturday
6+ years of hands-on development experience and in-depth knowledge of Java, Spring, Spring Boot, and Quarkus; front-end technologies like Angular and React JS are nice to have
● Excellent Engineering skills in designing and implementing scalable solutions
● Good knowledge of CI/CD Pipeline with strong focus on TDD
● Strong communication skills and ownership
● Exposure to Cloud, Kubernetes, Docker, Microservices is highly desired.
● Experience in working on public cloud environments like AWS, Azure, GCP w.r.t. solutions development, deployment & adoption of cloud-based technology components like IaaS / PaaS offerings
● Proficiency in PL/SQL and Database development.
Strong in J2EE & OOP design patterns.
Role: DevOps Engineer
Exp: 4 - 7 Years
CTC: up to 28 LPA
Key Responsibilities
• Design, build, and manage scalable infrastructure on cloud platforms (GCP, AWS, Azure, or OCI)
• Administer and optimize Kubernetes clusters and container runtimes (Docker, containerd)
• Develop and maintain CI/CD pipelines for multiple services and environments
• Manage infrastructure as code using tools like Terraform and/or Pulumi
• Automate operations with Python and shell scripting for deployment, monitoring, and maintenance
• Ensure high availability and performance of production systems and troubleshoot incidents effectively
• Monitor system metrics and implement observability best practices using tools like Prometheus, Grafana, ELK, etc.
• Collaborate with development, security, and product teams to align infrastructure with business needs
• Apply best practices in cloud networking, Linux administration, and configuration management
• Support compliance and security audits; assist with implementation of cloud security measures (e.g., firewalls, IDS/IPS, IAM hardening)
• Participate in on-call rotations and incident response activities
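The "automate operations with Python" and monitoring responsibilities above typically include small checks that evaluate metrics against thresholds before alerting. A minimal sketch (the metric shape and threshold are hypothetical; a real setup would query Prometheus or Datadog rather than take a dict):

```python
def hosts_over_threshold(samples, threshold=90.0):
    """Given {host: [cpu_percent, ...]} samples, return (host, avg) pairs
    for hosts whose average CPU utilization exceeds the threshold -- the
    kind of check an operations automation script runs before paging."""
    alerts = []
    for host, values in samples.items():
        avg = sum(values) / len(values)
        if avg > threshold:
            alerts.append((host, round(avg, 1)))
    return sorted(alerts)

samples = {
    "web-1": [95.0, 97.5, 99.0],
    "web-2": [40.0, 55.0, 60.0],
}
print(hosts_over_threshold(samples))  # [('web-1', 97.2)]
```

In practice the same threshold logic usually lives in the monitoring system itself (a Prometheus alerting rule or Datadog monitor); a script like this is useful for ad-hoc audits and runbook automation.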
If interested, kindly share your updated resume at 82008 31681.
Integration Developer
ROLE TITLE
Integration Developer
ROLE LOCATION(S)
Bangalore/Hyderabad/Chennai/Coimbatore/Noida/Kolkata/Pune/Indore
ROLE SUMMARY
The Integration Developer is a key member of the operations team, responsible for ensuring the smooth integration and functioning of various systems and software within the organization. This role involves technical support, system troubleshooting, performance monitoring, and assisting with the implementation of integration solutions.
ROLE RESPONSIBILITIES
· Design, develop, and maintain integration solutions using Spring Framework, Apache Camel, and other integration patterns such as RESTful APIs, SOAP services, file-based FTP/SFTP, and OAuth authentication.
· Collaborate with architects and cross-functional teams to design integration solutions that are scalable, secure, and aligned with business requirements.
· Resolve complex integration issues, performance bottlenecks, and data discrepancies across multiple systems. Support Production issues and fixes.
· Document integration processes, technical designs, APIs, and workflows to ensure clarity and ease of use.
· Participate in on-call rotation to provide 24/7 support for critical production issues.
· Apply source code / version control management experience in a collaborative work environment.
TECHNICAL QUALIFICATIONS
· 5+ years of experience in Java development with strong expertise in Spring Framework and Apache Camel for building enterprise-grade integrations.
· Proficient with Azure DevOps (ADO) for version control, CI/CD pipeline implementation, and project management.
· Hands-on experience with RESTful APIs, SOAP services, and file-based integrations using FTP and SFTP protocols.
· Strong analytical and troubleshooting skills for resolving complex integration and system issues.
· Experience in Azure Services, including Azure Service Bus, Azure Kubernetes Service (AKS), Azure Container Apps, and ideally Azure API Management (APIM) is a plus.
· Good understanding of containerization and cloud-native development, with experience in using Docker, Kubernetes, and Azure AKS.
· Experience with OAuth for secure authentication and authorization in integration solutions.
· Strong experience using GitHub for source control.
· Strong background in SQL databases (e.g., T-SQL, Stored Procedures) and working with data in an integration context.
GENERAL QUALIFICATIONS
· Excellent analytical and problem-solving skills, with a keen attention to detail.
· Effective communication skills, with the ability to collaborate with technical and non-technical stakeholders.
· Experience working in a fast-paced, production support environment with a focus on incident management and resolution.
· Experience in the insurance domain is considered a plus.
EDUCATION REQUIREMENTS
· Bachelor’s degree in Computer Science, Information Technology, or related field.
Azure DE
Primary Responsibilities -
- Create and maintain data storage solutions including Azure SQL Database, Azure Data Lake, and Azure Blob Storage.
- Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in Azure, and create data models for analytics purposes
- Create and maintain ETL (Extract, Transform, Load) operations using Azure Data Factory or comparable technologies
- Use Azure Data Factory and Databricks to assemble large, complex data sets
- Implement data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data.
- Ensure data security and compliance
- Collaborate with data engineers and other stakeholders to understand requirements and translate them into scalable and reliable data platform architectures
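The data validation and cleansing step mentioned above reduces, at its core, to rejecting malformed records and normalizing types before they enter downstream stores. A toy pure-Python sketch (field names are invented for illustration; a real Azure pipeline would express this in Data Factory or a Databricks/PySpark job):

```python
def cleanse_records(records):
    """Split raw records into cleaned and rejected lists -- a minimal
    stand-in for the validation an ingestion pipeline applies."""
    cleaned, rejected = [], []
    for rec in records:
        # Reject records missing required fields.
        if not rec.get("id") or rec.get("amount") is None:
            rejected.append(rec)
            continue
        # Normalize types: trim identifiers, coerce amounts to float.
        cleaned.append({
            "id": str(rec["id"]).strip(),
            "amount": float(rec["amount"]),
        })
    return cleaned, rejected

raw = [
    {"id": " 101 ", "amount": "19.99"},
    {"id": None, "amount": 5},        # rejected: missing id
    {"id": "102", "amount": None},    # rejected: missing amount
]
good, bad = cleanse_records(raw)
print(len(good), len(bad))  # 1 2
```

The same pattern scales out in Spark by expressing each rule as a DataFrame filter and routing failures to a quarantine table for inspection.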
Required skills:
- Blend of technical expertise, analytical problem-solving, and collaboration with cross-functional teams
- Azure DevOps
- Apache Spark, Python
- SQL proficiency
- Azure Databricks knowledge
- Big data technologies
The Data Engineers should be well versed in coding, Spark core, and data ingestion using Azure, and should have decent communication skills. They should also have core Azure DE skills and coding skills (PySpark, Python, and SQL).
Of the 7 open positions, 5 of the Azure Data Engineers should have a minimum of 5 years of relevant Data Engineering experience.
Roles and Responsibilities
• Ability to create solution prototype and conduct proof of concept of new tools.
• Work in research and understanding of new tools and areas.
• Clearly articulate the pros and cons of various technologies/platforms and perform detailed analysis of business problems and technical environments to derive a solution.
• Optimisation of the application for maximum speed and scalability.
• Work on feature development and bug fixing.
Technical skills
• Must have knowledge of networking in Linux, and basics of computer networks in general.
• Must have intermediate/advanced knowledge of one programming language, preferably Python.
• Must have experience of writing shell scripts and configuration files.
• Should be proficient in bash.
• Should have excellent Linux administration capabilities.
• Working experience of SCM. Git is preferred.
• Knowledge of build and CI/CD tools like Jenkins, Bamboo, etc. is a plus.
• Understanding of Architecture of OpenStack/Kubernetes is a plus.
• Code contributed to OpenStack/Kubernetes community will be a plus.
• Data Center network troubleshooting will be a plus.
• Understanding of NFV and SDN domain will be a plus.
Soft skills
• Excellent verbal and written communications skills.
• Highly driven, positive attitude, team player, self-learning, self-motivating, and flexible
• Strong customer focus; decent networking and relationship management
• Flair for creativity and innovation
• Strategic thinking
This is an individual contributor role and will need client interaction on the technical side.
Must have Skill - Linux, Networking, Python, Cloud
Additional Skills-OpenStack, Kubernetes, Shell, Java, Development
Quoality - A Modern Operating Infrastructure for Hospitality Businesses that helps them improve the guest experience and generate additional revenue.
Launched in 2021, Quoality is a Hospitality Tech company backed by a US-based Newchip Accelerator. Our mission is to equip businesses with the tools they need to grow.
By providing tools, insights, and education with minimal effort on the business's part, the barrier to making actionable business decisions is lowered dramatically. Everything we do is driven by this mission. Everything we do needs to positively answer the question, "Does this help businesses grow?"
Product Demo: https://www.youtube.com/watch?v=gKklkRYKkC4
Quoality Blogosphere: https://medium.com/quoality
🙋♀️ The role & what we are looking for?
Oh, this is our favorite part. We’ve seen companies that write 'templatized' BS for this section; stuff that’s never followed in reality.
Not us. Here are some of our core values to give you a sense of who we are as a team:
- First and foremost, we’re super transparent. We all know what everyone’s working on, how the company is doing, and the whole shebang.
- We’re all super chill.
- Work time ain’t playtime. We take our work very seriously. We’re a small bunch on a mission to change the guest experience in the hospitality industry, once and for all.
- Deadlines are a thing for us. We are super punctual.
- We know what we are working on and aren’t clueless.
- We’re more friends than colleagues.
- Monthly 1:1s with the founders where you can directly share & receive feedback.
Not convinced yet? Okay, here’s the fun stuff:
- Flexible leave policy. Yeah, it’s truly flexible. We trust you.
- We have a flexible remote work policy in India. Work from the beautiful beaches of Goa or the stunning mountains of Manali, we’re all good.
- Annual workstations at exotic destinations. Subject to change and hitting milestones.
- Allowances to choose your coworking space (if you want), courses & any tool that you think is beneficial to succeed in this role.
📄 The role & what we are looking for:
🧑💻 Responsibilities:
- Write code that will impact the businesses of thousands of hotels, hostels, and vacation homes across the globe.
- Collaborate with Frontend to spec, write, test, and deploy API endpoints.
- Implementation of a robust set of services and APIs that work on our data pipelines.
- Build all support infrastructure to scale our data delivery pipelines - endpoints, security, logging, messaging.
- Product development activity includes data querying from our various stores, real-time analytics, ML algorithms.
- Developing & Integrating of the front-end and back-end aspects of the web application
- Optimization of the application for maximum speed and scalability.
- Brainstorm features with the product team and guides decisions based on your knowledge of the codebase.
- Take full ownership of the module starting from architectural and design decisions to shipping.
- Like the work you do, enjoy collaborating with your coworkers, communicate as much as you can, and have fun.
- Up to years of relevant work experience.
- Understanding the nature of asynchronous programming and its quirks and workarounds.
- You possess strong computer science fundamentals: data structures, algorithms, distributed systems, and information retrieval.
- You have a good understanding of multi-process architecture and the threading limitations of Node.js and Express frameworks.
- Understanding of fundamental design principles behind a scalable application.
- You have basic understanding of relational databases as well as key-value databases and are capable of designing scalable database models based on the product requirements.
- Familiarity with REST API development standards, protocols (HTTP, WebSockets and more)
- You’re comfortable picking up new stacks and choosing the right tool to get the job done.
- You are open to learning new stuff and avoid internal politics. (super important).
- You are humble, kind, and open to feedback. (super duper important)
❕Bonus:
- Entrepreneurial spirit, Product Thinker, and ‘Can-Do' attitude.
- Passionate about leveraging technology for supporting product delivery in the Hospitality space.
- Operating style suited to working in a startup environment, where teamwork and resourcefulness are highly valued.
- Excellent leadership skills, including the ability to manage multiple assignments at a time.
🤙 What's the interview process like?
Yes, we do have a process, and it’s simple.
- Step 1: You apply for the job.
- Step 2: Quick intro call with the founders.
- Step 3: Solve an assignment.
- Step 4: A round where we evaluate your assignment along with you.
- Step 5: One round of interviews to gauge if you are a good culture fit.
- Step 6: Job Offer (contingent on the above steps).
- Step 7: Welcome to Quoality! 🙂
So, think we’re a good fit? Then let’s roll.🤘
A backend developer works alongside front-end developers, full-stack developers, programmers, or UX specialists to create comprehensive digital solutions for business needs. They make sure the website is scalable and that it can still function when hit by large loads—be that traffic or demanding scripts. They are also tasked with maintaining and testing existing back-end elements to ensure they’re as fast and efficient as possible. Data storage also comes under their remit, which requires them to have a good knowledge of data security and compliance.
Duties and responsibilities:
• Analyses, designs, develops, and codes REST APIs using NodeJS and other programming languages, for Web and Rich Internet Applications.
• Supports applications with an understanding of system integration, test planning, scripting, and troubleshooting.
• Assesses the health and performance of software applications and databases.
• Establishes, participates, and maintains relationships with business units, customers and subject matter experts to remain apprised of direction, project status, architectural and technology trends, risks, and functional/integration issues.
• Defines specifications, develops programs, modifies existing programs, prepares test data, and prepares functional specifications.
• Analyses program and application performance using various programming languages, tools and techniques.
• Manages a staff of 2-3 and manages software microservices effectively and efficiently.
• Reviews project proposals, evaluates alternatives, provides estimates, and makes recommendations.
• Designs and defines specifications for systems.
• Identifies potential process improvement areas, suggests options, and recommends approaches.
System Design Concepts:
- Scaling
- API Gateway
- Microservices
- Message Queue
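Of the system design concepts listed, the message queue is the one most easily shown in miniature: producers publish without waiting for consumers, which decouples services and absorbs load spikes. A tiny in-memory sketch (illustration only, not a production broker like Kafka, RabbitMQ, or SQS):

```python
from collections import deque

class MessageQueue:
    """In-memory FIFO queue illustrating producer/consumer decoupling:
    publishers never block on consumers, and messages wait until read."""
    def __init__(self):
        self._messages = deque()

    def publish(self, msg):
        self._messages.append(msg)

    def consume(self):
        # Return the oldest message, or None if the queue is empty.
        return self._messages.popleft() if self._messages else None

q = MessageQueue()
q.publish({"order_id": 1, "event": "created"})
q.publish({"order_id": 1, "event": "paid"})
print(q.consume()["event"])  # created
```

Real brokers add the properties this sketch lacks: durability across restarts, delivery acknowledgements, partitioning for parallel consumers, and back-pressure.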
Qualifications:
Graduation - B. Com + IT Certification/ B.Sc. - IT / B.Sc. Comp Sc./ B.Sc. IT/ BE-IT/ BE-Comps
Technical Skills:
Oracle, SQL, Java, Node JS, REST API Debugging.
Soft Skills:
Client Management; Good interpersonal skills; Good Communication
- Strong programming skills in Javascript/TypeScript, Python or Go.
- Hands-on experience in API development and frameworks such as Express, LoopBack, and Hapi.
- Good understanding of SQL and NoSQL databases.
- Experience in test-driven development (writing unit tests and API tests).
- Understanding of basic cloud computing concepts and experience in using any of the major cloud service providers(AWS/GCP/Azure).
- Ability to build and deploy the application in a containerized environment.
- Understanding of application logging and monitoring systems like Prometheus or Kibana.
Qualification:
- B.E/B.Tech/M.E./M.Tech/M.S. from a reputed university with a good academic record.
- Curiosity to explore cutting-edge technologies and bake them into products.
- Zeal and drive to take end-to-end ownership.
















